
Informatica Engineer - Fort Worth, TX (locals only)

Negotiable Salary

iSoftTek Solutions Inc

Fort Worth, TX, USA


Description

Job Title: Informatica Engineer
Location: Fort Worth, TX (locals only)
Mode of Work: Hybrid
Years of Experience: 6+
Visa: Any

Job Responsibilities:
This role is responsible for designing, coding, and modifying ETL processes using Informatica, based on mappings from source to target applications. Experience with migration to the cloud is required.

Job Duties:
- Participates in ETL design of new or changing mappings and workflows with the team and prepares technical specifications.
- Creates ETL mappings, mapplets, workflows, and worklets using Informatica PowerCenter 10.x and prepares the corresponding documentation.
- Designs and builds integrations supporting standard data warehousing objects (type-2 dimensions, aggregations, star schema, etc.).
- Performs source system analysis as required.
- Works with DBAs and Data Architects to plan and implement an appropriate data partitioning strategy in the Enterprise Data Warehouse.
- Implements versioning of the ETL repository and supporting code as necessary.
- Develops stored procedures, database triggers, and SQL queries where needed.
- Implements best practices and tunes SQL code for optimization.
- Loads data from SF PowerExchange to a MongoDB database using Informatica.
- Works with XML, the XML parser, and the Java and HTTP transformations within Informatica.
- Works with Informatica Data Quality (Analyst and Developer).
- Primary skill is Informatica PowerCenter, with 5-7 years of experience.

Skills and Qualifications:
Informatica PowerCenter, Informatica PowerExchange, MongoDB, IMS, relational databases.

Education/Experience:
- Bachelor's degree in computer science or equivalent training required.
- 5+ years' experience required.
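The duties above mention standard warehousing patterns such as type-2 (slowly changing) dimensions. Below is a minimal Python sketch of that versioning idea for context only; it is not the Informatica PowerCenter mappings the role would actually build, and all field names, function names, and sample values are assumptions.

```python
from datetime import date

def apply_scd2(dimension, incoming, key, tracked, load_date=None):
    """Type-2 (slowly changing dimension) logic: expire a changed row
    and append a new current version. Illustrative only; columns assumed."""
    load_date = load_date or date.today()
    # Index the currently active version of each business key.
    current = {row[key]: row for row in dimension if row.get("is_current")}
    result = list(dimension)
    for rec in incoming:
        old = current.get(rec[key])
        # New key, or any tracked attribute changed -> new version.
        if old is None or any(old.get(c) != rec.get(c) for c in tracked):
            if old is not None:
                old["is_current"] = False   # expire the previous version
                old["end_date"] = load_date
            new_row = dict(rec)
            new_row.update({"is_current": True,
                            "start_date": load_date,
                            "end_date": None})
            result.append(new_row)
    return result

# Example usage with assumed columns.
dim = [{"cust_id": 1, "city": "Dallas", "is_current": True,
        "start_date": date(2024, 1, 1), "end_date": None}]
src = [{"cust_id": 1, "city": "Fort Worth"}]
print(apply_scd2(dim, src, key="cust_id", tracked=["city"]))
```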

Source: Workable

Location
Fort Worth, TX, USA


You may also like

Craigslist
Prepress Technician/Mac Operator (Denver)
Job Summary:
Highly motivated individual with a minimum of 3 years' experience in a high-volume prepress environment for web offset and digital presses.

Essential Job Functions:
• Preflight, edit, and troubleshoot customer files to ensure they conform to a job's specifications.
• Process and trap files in Kodak Prinergy.
• Provide detailed analysis of potential problems and/or showstoppers to Account Management.
• Impose files for proof and/or press using Kodak Preps.
• Output (RIP) files for correct proof and/or plate manufacturing.
• Perform quality assurance throughout the prepress production cycle.
• Perform other duties and special tasks when assigned.

Qualifications:
• Understanding of current prepress and print industry standards and expectations.
• Knowledge of Mac and applied software.
• Fundamental knowledge of ripping and trapping of files.
• Demonstrated ability to read and comprehend written instructions.
• Excellent interpersonal, problem-solving, and troubleshooting skills with keen attention to detail.
• Good communication, organizational, and time-management skills.
• Adobe Creative Cloud: Acrobat, InDesign, Illustrator, Photoshop.
• Enfocus PitStop Pro.
• QuarkXPress.
• Kodak Prinergy Connect (Workshop).
• Preps imposition software.

Physical Demands:
• The physical demands are typical for an office setting. Must be able to lift up to 20 pounds.

Apply here: https://recruiting.paylocity.com/recruiting/jobs/Apply/3634285/Publication-Printers-Corporation/Mac-Operator
2001 S Platte River Dr, Denver, CO 80223, USA
$25-30/hour
Workable
Platform Engineer AI Tool & Integration
Platform Engineer - AI Tool & Integration

Position Overview:
The Data Analytics & AI department at RCG is seeking a highly skilled and experienced Software & Platform Engineer to join our team. This pivotal role requires a strong technical background in AI tooling, data platform architecture, cloud computing, and big data technologies. The successful candidate will be responsible for all tooling and integration with GenAI LLMs, maintaining our Azure platform with OpenAI, and leveraging Databricks Mosaic AI. This role will be instrumental in driving innovation, ensuring seamless integration, and optimizing our AI and data platforms to meet the evolving needs of the business.

Key Responsibilities:
• Design, develop, and maintain tooling and integration for GenAI LLMs and other AI models.
• Manage and optimize our Azure platform, ensuring seamless integration with OpenAI and Databricks Mosaic AI.
• Collaborate with cross-functional teams to identify and implement innovative AI solutions to enhance our platform capabilities.
• Stay up to date with the latest advancements in AI and machine learning technologies to drive continuous improvement and innovation.
• Develop and implement best practices for AI model deployment and scaling.
• Design and execute integration strategies for incorporating large language models (LLMs) and CoPilot technologies with existing business platforms.
• Assess and recommend suitable LLM and CoPilot technologies that align with business needs and technical requirements.
• Conduct feasibility studies and proofs of concept to validate the integration of new tools and technologies.
• Keep abreast of the latest advancements in LLM, CoPilot, and related technologies to identify opportunities for further innovation and improvement.
• Understand and leverage MS CoPilot and MS CoPilot Studio for enhanced productivity and collaboration within the development team.
• Integrate MS CoPilot tools into existing workflows and ensure seamless integration with other systems and applications used by the team.
• Work closely with product managers, data scientists, and other stakeholders to gather requirements and ensure successful integration of LLM and CoPilot technologies.

Requirements:
• Bachelor's or Master's degree in computer science, engineering, or a related field.
• Proven experience in software/system/platform engineering, with a focus on AI tooling and integration.
• Strong expertise in working with Azure, including managing and integrating AI services such as OpenAI and Databricks Mosaic AI.
• Proficiency in programming languages such as Python, Java, or C++.
• Experience with machine learning frameworks and libraries (e.g., TensorFlow, PyTorch).
• Solid understanding of software development methodologies, including Agile and DevOps practices and tools for continuous integration and deployment.
• Understanding of data security best practices and compliance requirements in software development and integration.
• Excellent problem-solving skills and the ability to work in a fast-paced, dynamic environment.
• Strong communication and collaboration skills.

Preferred Qualifications:
• Strong proficiency in one or more programming languages such as Python, Java, C#, or JavaScript.
• Experience with large language models (LLMs) and familiarity with tools like OpenAI GPT, Google BERT, or similar.
• Hands-on experience with Databricks for data engineering, data science, or machine learning workflows.
• Proficiency in using Databricks for building and managing data pipelines, ETL processes, and real-time data processing.
• Experience with machine learning frameworks and libraries (e.g., TensorFlow, PyTorch).
• Hands-on experience with Microsoft CoPilot and CoPilot Studio.
• Proficiency in working with APIs, microservices architecture, and web services (REST/SOAP).
• Familiarity with cloud platforms such as AWS, Azure, or Google Cloud, and their integration services.
• Knowledge of database systems, both SQL and NoSQL (e.g., MySQL, MongoDB).
• Knowledge of natural language processing (NLP) and large language models (LLMs).
• Previous experience in a similar role within a technology-driven organization.
• Certifications in Azure, AI, or related areas.
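As a rough illustration of the kind of LLM tooling and integration work described in this posting, the sketch below calls an OpenAI-compatible chat-completions REST endpoint from Python. The endpoint URL, model name, and environment variable are assumptions for illustration; an Azure OpenAI deployment would use its own endpoint and authentication scheme.

```python
import os
import requests

# Assumed endpoint and credential location; not details from the posting.
API_URL = "https://api.openai.com/v1/chat/completions"
API_KEY = os.environ["OPENAI_API_KEY"]

def ask_llm(prompt: str, model: str = "gpt-4o-mini") -> str:
    """Send a single prompt to a chat-completions endpoint and return the reply text."""
    resp = requests.post(
        API_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"model": model,
              "messages": [{"role": "user", "content": prompt}]},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(ask_llm("Summarize what a platform integration engineer does."))
```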
Coral Gables, FL, USA
Negotiable Salary