About Racedog Technologies

Racedog Technologies is a staffing and recruiting company.


Data Architect

2019-04-19 | McLean, VA | DOE | 12 Months+ Contract



Preference: local candidates, or those in nearby states.

We are looking for someone with strong experience in big data and data modeling.

Position Summary
As a team member in this role, you will be responsible for:

· Creating and maintaining optimal data pipeline architecture.

· Assembling large, complex data sets that meet functional and non-functional business requirements.

· Identifying, designing, and implementing internal process improvements: automating manual processes, optimizing data delivery, and re-designing code and infrastructure for greater scalability.

· Building reusable data pipelines to ingest, standardize, and shape data from various zones in the Hadoop data lake.

· Building analytic tools that use the data pipeline to provide actionable insights into customer acquisition, revenue management, and digital and marketing areas for operational efficiency and KPIs.

· Designing and building BI APIs on established enterprise architecture patterns for data sharing from various sources.

· Designing and integrating data using big data tools such as Spark, Scala, and Hive.

· Helping manage the library of all deployed Application Programming Interfaces (APIs).

· Supporting API documentation of classes, methods, scenarios, code, design rationales, and contracts.

· Designing, building, and maintaining a small set of highly flexible and scalable models linked to the client's specific business.

Required Qualifications:

· Minimum Education: Associate's degree, college diploma, or CEGEP in computer science, information systems, or another quantitative field.

· Minimum Years of Experience: five (5) or more years in data engineering / data integration.

· Advanced SQL knowledge, including experience with relational databases and query authoring, as well as working familiarity with a variety of databases.

· Experience building and optimizing big-data pipelines, architectures, and data sets.

· Experience building programs and processes supporting data transformation, data structures, metadata, dependency management, and workload management.

· Experience with data warehouse and data mart ETL implementations using big data technologies.

· Working knowledge of message queuing, stream processing, and scalable data stores.

· Experience with relational SQL and NoSQL databases; graph databases preferred.

· Strong experience with object-oriented programming in Java, C++, or Scala (preferred).

· Experience with AWS cloud services and streaming frameworks such as Storm and Spark Streaming.

· Experience with API and web services design and development.

Preferred Qualifications:

· Functional experience in the hospitality industry.

· End-to-end experience building data flow processes (from ingestion to the consumption layer).

· Solid working experience with surrounding and supporting disciplines (data quality, data operations, BI / data warehouse / data lake).

· Effective communicator, collaborator, influencer, and solution seeker across a variety of opinions.

· Self-starter: well organized, extremely detail-oriented, an assertive team player willing to take ownership of responsibilities, with a high level of positive energy and drive.

· Excellent time management and organizational skills

· Ability to manage multiple priorities, work well under pressure, and effectively prioritize concurrent demands.

