Job Details

Data Architect - Remote / Telecommute

  2025-10-01     Cynet Systems     All Cities, AK
Description:

Overview

Pay Range: $07/hr - $75/hr

Responsibilities
  • Data Pipeline Development: Design, build, and maintain scalable data pipelines to ingest, process, and transform structured and unstructured data.
  • Data Modeling: Create optimized data models to support analytics, reporting, and machine learning workflows.
  • ETL/ELT Processes: Develop and manage ETL/ELT workflows to ensure clean, reliable, and high-quality data.
  • Database Management: Work with relational and NoSQL databases to ensure efficient storage and retrieval of large datasets.
  • Cloud Data Solutions: Implement and optimize data solutions on cloud platforms like AWS, Azure, or GCP.
  • Data Quality & Governance: Ensure data integrity, security, compliance, and quality across systems.
  • Collaboration: Partner with data scientists, analysts, and software engineers to deliver reliable data infrastructure.
  • Automation: Streamline data processes using orchestration tools and automation frameworks.
  • Monitoring & Optimization: Implement monitoring, logging, and performance tuning of data systems.
  • Documentation: Maintain detailed documentation of data pipelines, architecture, and workflows.
Skills
  • Programming Skills: Proficiency in Python and SQL; familiarity with Java or Scala.
  • Data Pipelines & ETL: Experience with ETL tools (Airflow, dbt, Informatica, Talend).
  • Big Data Frameworks: Knowledge of Spark, Hadoop, Kafka, or Flink.
  • Data Warehousing: Hands-on experience with Snowflake, Redshift, BigQuery, or Synapse.
  • Cloud Platforms: Proficiency in AWS (Glue, Redshift, S3), Azure (Data Factory, Synapse), or GCP (BigQuery, Dataflow).
  • Databases: Strong experience with relational databases (PostgreSQL, MySQL, Oracle) and NoSQL databases (MongoDB, Cassandra).
  • Data Modeling: Expertise in designing star/snowflake schemas, OLTP/OLAP systems.
  • DevOps & Version Control: Familiarity with Git, CI/CD pipelines, and Infrastructure as Code (Terraform).
  • Data Governance & Security: Knowledge of GDPR, HIPAA, encryption, role-based access controls.
  • Analytical Skills: Strong problem-solving and optimization skills for large-scale data processing.
  • Collaboration & Communication: Ability to work in cross-functional teams and clearly document technical processes.


