Data Scientist (Big Data Engineer) 2


Job Details

The Office of the Attorney General of Texas requires the services of 3 Data Scientist (Big Data Engineer) 2 resources, hereafter referred to as Candidate(s), who meet the general qualifications of Data Scientist (Big Data Engineer) 2, Data/Database Administration, and the specifications outlined in this document for the Office of the Attorney General of Texas.

All work products resulting from the project shall be considered "works made for hire" and are the property of the Office of the Attorney General of Texas. The selection process may include pre-selection requirements that potential Vendors (and their Candidates) submit to and satisfy criminal background checks as authorized by Texas law. The Office of the Attorney General of Texas will pay no fees for interviews or discussions that occur during the process of selecting a Candidate(s).

The Office of the Attorney General's Chief Data Office (CDO) is seeking a dynamic and visionary Data Scientist to join our team and support our System Modernization efforts for the Child Support Division. This individual must be able to work in a heavily technical environment, preparing and optimizing data for Snowflake and utilizing SAS Viya to build comprehensive federal and state reports. You will play a pivotal role in transforming, analyzing, and leveraging our data assets to drive strategic initiatives and business outcomes. You must be comfortable analyzing data, developing predictive modeling algorithms, and communicating your findings through visualizations that enable the discovery of solutions to business problems. As a Data Scientist, your role will be impactful, visible, and rewarding.

Responsibilities will include:

  • Data Conversion and Reporting Support for System Modernization Efforts
  • Data Transformation and Integration: Prepare and optimize data for migration to Snowflake and SAS Viya platforms, ensuring seamless integration and functionality by creating data transformation processes using ETL, SQL, Python, and R.
  • Develop Federal and State Reports: Build comprehensive reports that meet federal and state requirements using Snowflake and SAS Viya, ensuring accuracy and compliance.
  • Scrum Team Collaboration: Work as a member of an agile team to deliver new features and functions, delivering best-in-class value-based technology solutions.
  • Data Quality Management: Develop and implement databases, ETL processes, data collection systems, and data quality strategies that optimize statistical efficiency, accuracy, and quality.
  • Problem Examination and Resolution: Examine problems within the Data Intelligence space using ETL, Lambda, and Glue, and implement necessary changes to ensure data quality improvement.
  • Data Analytics and Insights: Utilize advanced data analytics techniques to support strategic decision-making, ensuring data integrity, quality, and timeliness of results.
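To illustrate the kind of data transformation work described above, the following is a minimal, hypothetical sketch of an ETL cleaning step in Python: normalizing a raw CSV extract (trimming whitespace, standardizing state codes, and converting dates to ISO 8601) before staging rows for a warehouse load such as Snowflake. The column names and source format are assumptions for illustration only, not the agency's actual schema.

```python
import csv
import io
from datetime import datetime

def transform_rows(raw_csv: str) -> list:
    """Normalize a raw CSV extract before staging it for a warehouse load.

    Hypothetical example: trims whitespace, upper-cases state codes, and
    rewrites dates as ISO 8601 so downstream SQL can parse them reliably.
    """
    cleaned = []
    for row in csv.DictReader(io.StringIO(raw_csv)):
        cleaned.append({
            "case_id": row["case_id"].strip(),
            "state": row["state"].strip().upper(),
            # Assumed source format is MM/DD/YYYY; warehouse expects YYYY-MM-DD.
            "opened": datetime.strptime(
                row["opened"].strip(), "%m/%d/%Y"
            ).date().isoformat(),
        })
    return cleaned

raw = "case_id,state,opened\n 1001 , tx ,07/29/2024\n"
print(transform_rows(raw))
```

In practice, a step like this would feed a Snowflake stage (e.g., via `COPY INTO`) or an AWS Glue job rather than printing to stdout; the sketch only shows the transformation logic itself.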

The above job description and requirements are general in nature and may be subject to change based on the specific needs and requirements of the organization and project.

II. CANDIDATE SKILLS AND QUALIFICATIONS
Minimum Requirements:
Candidates that do not meet or exceed the minimum stated requirements (skills/experience) will be displayed to customers but may not be chosen for this opportunity.

Years | Required/Preferred | Experience
4 | Required | Proven experience in data conversion and report building using Snowflake and SAS Viya.
4 | Required | Demonstrated experience in data transformation processes using ETL, SQL, Python, and R.
4 | Required | Experience working with data analytics and business intelligence tools.
4 | Required | Experience working in a Scrum or Agile development environment.
4 | Required | Proficiency in ETL processes and tools such as AWS Glue and Lambda.
4 | Required | Strong knowledge of database management and data warehousing concepts.
4 | Required | Expertise in SQL for data querying and manipulation.





 Smart IMS.

 07/29/2024

 Austin, TX