Data Engineering Specialist at Internews

Job Type: Limited-term
Job Category: Information and Communications Technology
Hiring Organization: Internews
Education Level: Not specified
Experience: 10+ years
Deadline: May 1, 2024
Location: Remote (United States or United Kingdom)

About the Opportunity:

Internews seeks a Data Engineering Specialist for the Media Viability Accelerator (MVA) program. The specialist will lead the development, testing, and deployment of the MVA platform's data components.

About the MVA:

The Media Viability Accelerator (MVA) Forward program aims to support independent media by developing an online platform that enhances their business performance. Built on Azure and using ASP.NET Core, Angular, Bootstrap, and Power BI, the platform offers tailored insights to transform media businesses.

Logistics:

This is a remote-based role, open to candidates in the United States or United Kingdom. The position is limited-term, with an expected end date of September 24, 2026.

Responsibilities:

  • Data Pipelines: Maintain and extend analytics pipelines with Azure Synapse and Azure Data Factory.
  • Database Systems: Manage database systems for efficient querying and caching using Azure SQL and Synapse Analytics.
  • Optimizing Performance: Ensure maximum accessibility, speed, and scalability of connectors and queues.
  • Data Storage: Safeguard data through encrypted data stores and disaster recovery strategies.
  • Automated Task Management: Oversee scheduled tasks for user registration and data scraping operations.
  • API Integration and Python Scripting: Develop integrations with platforms such as Google Analytics and maintain Python scripts for web scraping.
  • Integration with Back End: Collaborate with the Back-End Web Development Specialist to integrate data-gathering processes.
  • Documentation: Maintain documentation of processes and components to support team collaboration.
  • Code Maintenance and Improvement: Enhance the existing codebase for optimal performance and scalability.
  • Testing: Implement continuous logging and testing strategies to ensure high-quality outputs.
  • Actions and Automation: Schedule and execute pipeline tasks, manage updates, and track automated workflows.

Qualifications:

Required:

  • Minimum of 10 years of relevant experience, including 3 years as a Data Engineer.
  • Microsoft Azure Data Engineer certification.
  • Database and data warehousing expertise.
  • Proficiency in Python scripting and RESTful API integration.
  • Experience with CI/CD practices and version control systems like Git.
  • Excellent communication and interpersonal skills.

Preferred:

  • Experience with performance metrics for data visualization.
  • Ability to troubleshoot complex issues and develop effective solutions.
  • Extensive experience with data security and compliance best practices.
  • Familiarity with news media.

To apply for this job please visit phf.tbe.taleo.net.
