Job Opportunities
Location: Ashburn, VA, and various client locations throughout the U.S.
Duration: Long term
Job Description:
- Epic Beacon-certified specialist experienced in working with clinical operations teams, end users within the organization, and outside parties as necessary to build and validate protocols, configure operational workflows, and troubleshoot technical issues.
- Responsible for configuring, testing, and deploying the Epic software according to decisions made by operational and technical SMEs.
- Participate in initiatives by gathering and analyzing information, preparing structured documentation, presenting findings, and troubleshooting as necessary.
Qualifications:
Job title: Epic Technical Lead – DBA, Bridges, Data Courier Sr. Analyst
Location: Ashburn, VA, and various client locations throughout the U.S.
Duration: Long term
Job Description:
- The Applications Analyst is responsible for coordinating all issues that arise during the project for their application areas and must be very knowledgeable about the organization's policies, procedures, and business operations.
- The Applications Analyst is responsible for configuring, testing, implementing and supporting Epic for the delivery of patient care and other supporting functions of the organization.
- Responsible for guiding workflow design, building and testing the system, and analyzing other technical issues associated with Epic software.
- Works with Epic representatives, the organization's business community, and end users to ensure the system meets the organization's business needs with respect to the project deliverables and timeline.
Qualifications:
Job title: Hadoop Developer/Tester
Location: Tempe, AZ
Duration: Long term
Job Description:
- Experience validating different source file formats against a standard messaging format (e.g., JSON); understanding of the Kafka streaming process and the Spark framework.
- Experience validating Hive database tables against source files (e.g., Avro, Parquet) or staging tables using mapping/transformation logic.
- Experience validating HBase tables or similar column-oriented/NoSQL databases.
- Experience writing Python/Scala or other scripts to automate test execution.
- Knowledge of tokenization or data masking, and experience testing the same.
- Knowledge of test data setup/mockup in file systems and database tables.
- Ability to analyze log and error files for data ingestion failures.
- Knowledge of or hands-on experience with NiFi, Kafka, Atlas, Schema Registry, Ranger, Oozie, GitLab, CI/CD, PyCharm, and HDFS.
- Programming knowledge in Python/Scala/Java and HiveQL/NoSQL/Spark SQL.
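For illustration only (not part of the posting): the file-to-table validation and Python test automation described above can be sketched roughly as below. The record layout, key field, and table contents are hypothetical; a real test would read the extract from HDFS and query Hive or HBase rather than use in-memory lists.

```python
import json

def validate_load(source_jsonl, target_rows, key="id"):
    """Reconcile a source JSON-lines extract against rows loaded into a
    target table: compare record counts and key sets in both directions.
    (Hypothetical structures; real checks would query the actual store.)"""
    source = [json.loads(line) for line in source_jsonl.splitlines() if line.strip()]
    src_keys = {rec[key] for rec in source}
    tgt_keys = {row[key] for row in target_rows}
    return {
        "count_match": len(source) == len(target_rows),
        "missing_in_target": src_keys - tgt_keys,
        "unexpected_in_target": tgt_keys - src_keys,
    }

# Mock ingestion run in which one record was dropped:
extract = '{"id": 1, "amt": 10}\n{"id": 2, "amt": 20}\n{"id": 3, "amt": 30}'
loaded = [{"id": 1, "amt": 10}, {"id": 2, "amt": 20}]
report = validate_load(extract, loaded)
print(report)  # count_match is False; id 3 is missing in target
```

In practice the same count/key reconciliation is wrapped in a test runner (e.g., pytest) so ingestion failures surface as assertion failures in CI/CD.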
Job title: Pega Developer
Location: Oakland, CA
Duration: Long term
Skills & Experience:
- Pega experience – 8+ years
- Pega certification is a must
- Experienced with the following Pega versions: 8.x, 7.x, 6.x
- Experience with version 8.x is a must
- Experience in implementing SOAP/REST services
- Experienced with or knowledgeable about upgrades (preferably to version 8.x)
- Knowledge of OOTB (out-of-the-box) functionality
- Guardrails
- Code Optimizations
- Code reviews
- Offshore coordination
- Performance Improvement
- Agents
- Experienced with databases (DB2, Oracle)
- Reporting – BIX (Optional)
- Service tools – HPSD, ServiceNow
- Project Methodology – Agile & Waterfall
- Scripting – Java
Job title: Certified Scrum Master
Location: Denver, CO
Duration: Long term
Job Description:
Basic Qualifications:
- CSM Certification (Certified Scrum Master)
- At least 3 years of Agile Delivery experience
Preferred Qualifications:
- Bachelor’s Degree
- 4+ years of Agile Delivery experience
- Strong drive to see work through to completion when faced with ambiguity
- Ability to energize a group of engineers to achieve their full potential
- Excellent communication and partnership skills to effectively manage, inform, and influence outcomes
Job title: Data Engineer
Location: San Jose, CA
Duration: 12 Months
Job Description:
- More than 8 years of experience.
- Hands-on experience designing and executing projects using Google Cloud Platform services such as App Engine, Compute Engine, Cloud Storage, BigQuery, Dataproc, and Dataflow.
- Strong programming skills in R, Python, or Spark.
- Strong knowledge of data engineering, simulation, and modeling concepts.
- Proficiency in handling billions of structured or unstructured transactional records.
- Proficiency in modeling techniques such as linear regression, logistic regression, and GLMs.
- Knowledge of machine learning techniques such as decision trees, XGBoost, random forests, and PCA.
- Knowledge of unsupervised machine learning techniques such as clustering and segmentation.
- Strong knowledge of data manipulation and transformation.
- Knowledge of loading data into GCP services such as BigQuery and Cloud Storage.
- Knowledge of Hadoop, Hive, and Pig.
- Good communication skills.
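For illustration only (not part of the posting): the logistic-regression skill listed above can be sketched in pure Python with batch gradient descent on a tiny, hypothetical dataset; production work would use R, Spark MLlib, or a library such as scikit-learn instead.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(X, y, lr=0.1, epochs=2000):
    """Fit logistic-regression weights by batch gradient descent.
    Each row of X is expected to start with a bias column of 1.0."""
    w = [0.0] * len(X[0])
    n = len(X)
    for _ in range(epochs):
        grad = [0.0] * len(w)
        for xi, yi in zip(X, y):
            err = sigmoid(sum(wj * xj for wj, xj in zip(w, xi))) - yi
            for j, xj in enumerate(xi):
                grad[j] += err * xj / n
        w = [wj - lr * gj for wj, gj in zip(w, grad)]
    return w

def predict(w, xi):
    """Predicted probability of class 1 for one example."""
    return sigmoid(sum(wj * xj for wj, xj in zip(w, xi)))

# Toy separable data: label is 1 when the feature exceeds 2.5.
X = [[1.0, x] for x in (1.0, 2.0, 3.0, 4.0)]  # bias column + one feature
y = [0, 0, 1, 1]
w = fit_logistic(X, y)
print(predict(w, [1.0, 4.0]) > 0.5)  # True
```

The sketch uses the standard gradient of the log-loss; swapping the sigmoid link for another link function gives the GLM family the posting also mentions.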