• Urgent role_Data Engineer_Dallas TX (initially remote)_6 months plus C2C

    From Satish Chaturvedi@21:1/5 to All on Wed Sep 1 06:56:30 2021
    Hello,
    Hope you are doing well.
    I have an urgent position for a Data Engineer. Please share your profile with satish.c@idctechnologies.com.
    Job Title: Data Engineer
    Location: Dallas, TX (initially remote)
    Duration: 6+ months, C2C
    Mode of Interview: Phone, Skype
    Visa: Only H4 EAD / GC EAD / GC / USC / H1B on C2C
    Client: Thomson Reuters/TCS
    The Senior Data Engineer will be responsible for writing SQL, Python and PySpark scripts used in API calls to pull data from multiple disparate systems and databases. The source data may include analytics system data (Google Analytics, Adobe
    Analytics), third-party systems, CSV files, RSS feeds, etc. This individual will also assist with cleaning the data so it is in a readily accessible format for the BI systems. The Senior Data Engineer will contribute expertise, embrace emerging trends and
    provide overall guidance on best practices across all of News Corp's business and technology groups. The position requires the ability to multitask and work independently, as well as to work collaboratively with teams, some of which may be geographically
    distributed.
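
    To make the day-to-day concrete, here is a minimal, illustrative sketch of that extract-and-clean step in Python. The endpoint, token and payload shape are hypothetical placeholders, not part of any specific client system.

    import requests
    import pandas as pd

    API_URL = "https://api.example.com/v1/report"   # hypothetical source endpoint
    API_TOKEN = "REPLACE_ME"                        # hypothetical credential

    def pull_report(start_date: str, end_date: str) -> pd.DataFrame:
        """Call a source system's REST API and return the rows as a DataFrame."""
        resp = requests.get(
            API_URL,
            headers={"Authorization": f"Bearer {API_TOKEN}"},
            params={"start": start_date, "end": end_date},
            timeout=30,
        )
        resp.raise_for_status()
        return pd.DataFrame(resp.json()["rows"])    # assumes a {"rows": [...]} payload

    def clean_for_bi(df: pd.DataFrame) -> pd.DataFrame:
        """Basic cleanup so the BI layer gets consistent column names and no duplicates."""
        df = df.copy()
        df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
        return df.drop_duplicates()

    if __name__ == "__main__":
        report = clean_for_bi(pull_report("2021-08-01", "2021-08-31"))
        report.to_csv("report_clean.csv", index=False)  # staging file for the warehouse load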

    ● Be accountable for planning, designing, developing and implementing applications to provide services to the global organisation
    ● Bring solid experience with and understanding of data models for the data warehouse and BI data marts
    ● Provide technical support for the data warehouse and BI tools & infrastructure
    ● Identification, analysis & resolution of production & development bugs
    ● Support the release process, including completing & reviewing documentation
    ● Configure data mappings & transformations to orchestrate data integration & validation
    ● Design, build & test visualizations (Google Data Studio, Tableau) and ETL processes using GCP Composer, Airflow & SQL, or any ETL tool, for the corporate data warehouse on Google BigQuery or other databases (a minimal Airflow sketch follows this list)
    ● Develop domain knowledge and become a subject matter expert in key business verticals, e.g. Media & Publications
    ● Document solutions, tools, processes & create/support test plans with hands-on testing
    ● Peer review work developed by other data engineers within the team
    ● Establish good working relationships, communication channels with relevant departments & senior stakeholders
    ● Have a passion for all things automation and be a stickler for efficiency
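
    As a rough illustration of the Composer/Airflow & BigQuery work mentioned above, the sketch below shows a minimal daily DAG with one Python extract task feeding one BigQuery SQL transform. It assumes Airflow 2.x with the Google provider installed; the DAG id, project, dataset, table names and query are hypothetical placeholders.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator
    from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

    def extract_to_staging():
        """Placeholder: pull source data (API, CSV, feed) and land it in a staging table."""
        pass

    with DAG(
        dag_id="example_daily_warehouse_load",   # hypothetical DAG name
        start_date=datetime(2021, 9, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        extract = PythonOperator(
            task_id="extract_to_staging",
            python_callable=extract_to_staging,
        )

        transform = BigQueryInsertJobOperator(
            task_id="load_data_mart",
            configuration={
                "query": {
                    # hypothetical project, dataset and table names
                    "query": "SELECT * FROM `example-project.staging.events`",
                    "destinationTable": {
                        "projectId": "example-project",
                        "datasetId": "warehouse",
                        "tableId": "events_daily",
                    },
                    "writeDisposition": "WRITE_TRUNCATE",
                    "useLegacySql": False,
                },
            },
        )

        extract >> transform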

    Required Experience
    ● Building ETL data pipelines using SQL and Python for API calls to retrieve data
    ● Expertise in Google BigQuery, PySpark, SQL and relational databases (PostgreSQL, MySQL); scripting experience with Shell & Python (a short PySpark sketch follows this list)
    ● Experience working with Atlassian products such as Jira and Confluence
    ● Experience in managing code within the SDLC methodology
    ● A firm understanding of source control concepts, in particular using a Git repository
    ● Proven ability to write both customer-facing and technical documentation
    ● Experience in Google Analytics & Adobe Analytics is a plus
    ● Hands-on experience in developing visualizations using Google Data Studio
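
    For the PySpark side, here is a minimal sketch of the kind of cleanup job described in the role: read raw CSV extracts, normalise column names, deduplicate, and write them out for the warehouse load. The app name and bucket paths are hypothetical placeholders.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("clean_raw_extracts").getOrCreate()

    raw = (
        spark.read
        .option("header", True)
        .option("inferSchema", True)
        .csv("gs://example-bucket/raw/analytics/*.csv")   # hypothetical source location
    )

    cleaned = (
        raw
        # normalise column names to snake_case for the warehouse
        .toDF(*[c.strip().lower().replace(" ", "_") for c in raw.columns])
        .dropDuplicates()
        .withColumn("load_date", F.current_date())
    )

    # Write partitioned Parquet for downstream loading into BigQuery or another warehouse.
    cleaned.write.mode("overwrite").partitionBy("load_date").parquet(
        "gs://example-bucket/curated/analytics/"
    )
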
    Thank you
    Satish Chaturvedi
