
Senior Software Developer (Gateway/Market Data)

$150,000/year

Eagle Seven

Chicago, IL, USA


Description

Eagle Seven is seeking a Senior Software Developer focused on exchange connectivity and market data. The individual will be responsible for analyzing exchange protocols, proposing design solutions, and implementing connectivity to trading venues across the world. The role is part of the platform development team and will give the individual exposure to traders and strategy developers. The successful candidate will be a self-starter with a strong sense of ownership, driven to provide technical and intellectual solutions to business problems.

Primary Responsibilities include:
- Architecting and implementing low-latency market access solutions
- Understanding, interpreting, and interfacing with global exchanges and their protocols
- Designing, developing, and supporting market data feed handlers and exchange order routers
- Diagnosing latency issues and resolving them with appropriate tuning and optimizations
- Working with traders to source, evaluate, and facilitate access to new data sources
- Working with the extended team to capture, house, and provide historical access to market data
- Liaising with vendors on data and technical issues as needed to deliver rapid solutions to the business

Requirements

Skills and Experience:
- Bachelor's degree in Computer Science or a related field
- Proven track record of understanding and working with global exchange protocols
- Experience writing parsers for exchange protocols such as FIX and ITCH
- Strong background in C++ and C++ template metaprogramming, with demonstrated experience using C++14/C++20
- Expertise with TCP/IP, UDP multicast, sockets, and network protocols, particularly on Linux/Unix systems
- Experience using network tools such as Wireshark and tcpdump to monitor and debug behavior
- Ability to work in a collaborative environment
- Excellent written and verbal communication skills

Benefits

Eagle Seven offers a competitive and comprehensive benefits package to all full-time employees.
- Medical PPO and HMO coverage through BlueCross BlueShield
- Company contributions to a Health Savings Account (with enrollment in a High Deductible Health Plan)
- Dental coverage through Principal
- Vision coverage through VSP
- 401k Retirement Savings Plan with employer match
- Company-paid life insurance
- Company-paid disability insurance
- Paid time off
- Flexible Spending Account
- Pre-tax transit benefits
- Complimentary lunch and beverages

The minimum base salary for this role starts at $150,000. This role is eligible for a discretionary performance bonus as part of the total compensation package, in addition to the benefits listed above. Exact compensation offered may vary based on factors including, but not limited to, the candidate's experience, qualifications, and skill set.

Source: workable

Location
Chicago, IL, USA


You may also like

Workable
Senior Big Data Engineer
ABOUT US: Headquartered in the United States, TP-Link Systems Inc. is a global provider of reliable networking devices and smart home products, consistently ranked as the world's top provider of Wi-Fi devices. The company is committed to delivering innovative products that enhance people's lives through faster, more reliable connectivity. With a commitment to excellence, TP-Link serves customers in over 170 countries and continues to grow its global footprint.

We believe technology changes the world for the better! At TP-Link Systems Inc., we are committed to crafting dependable, high-performance products to connect users worldwide with the wonders of technology. Embracing professionalism, innovation, excellence, and simplicity, we aim to assist our clients in achieving remarkable global performance and enable consumers to enjoy a seamless, effortless lifestyle.

KEY RESPONSIBILITIES
- Develop and maintain the Big Data Platform by performing data cleansing, data warehouse modeling, and report development on large datasets.
- Collaborate with cross-functional teams to provide actionable insights for decision-making.
- Manage the operation and administration of the Big Data Platform, including system deployment, task scheduling, proactive monitoring, and alerting to ensure stability and security.
- Handle data collection and integration tasks, including ETL development, data de-identification, and managing data security.
- Provide support for other departments by processing data, writing queries, developing solutions, performing statistical analysis, and generating reports.
- Troubleshoot and resolve critical issues, conduct fault diagnosis, and optimize system performance.

Requirements

REQUIRED QUALIFICATIONS
- Bachelor's degree or higher in Computer Science or a related field, with at least three years of experience maintaining a Big Data platform.
- Strong understanding of Big Data technologies such as Hadoop, Flink, Spark, Hive, HBase, and Airflow, with proven expertise in Big Data development and performance optimization.
- Familiarity with Big Data OLAP tools like Kylin, Impala, and ClickHouse, as well as experience in data warehouse design, data modeling, and report generation.
- Proficiency in Linux development environments and Python programming.
- Excellent communication, collaboration, and teamwork skills, with a proactive attitude and a strong sense of responsibility.

PREFERRED QUALIFICATIONS
- Experience with cloud-based deployments, particularly AWS EMR; familiarity with other cloud platforms is a plus.
- Proficiency in additional languages such as Java or Scala is a plus.

Benefits
Salary Range: $150,000 - $180,000
- Free snacks and drinks, and provided lunch on Fridays
- Fully paid medical, dental, and vision insurance (partial coverage for dependents)
- Contributions to 401k funds
- Bi-annual reviews and annual pay increases
- Health and wellness benefits, including free gym membership
- Quarterly team-building events

At TP-Link Systems Inc., we are continually searching for ambitious individuals who are passionate about their work. We believe that diversity fuels innovation and collaboration, and drives our entrepreneurial spirit. As a global company, we highly value diverse perspectives and are committed to cultivating an environment where all voices are heard, respected, and valued. We are dedicated to providing equal employment opportunities to all employees and applicants, and we prohibit discrimination and harassment of any kind based on race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state, or local laws. Beyond compliance, we strive to create a supportive and growth-oriented workplace for everyone.
If you share our passion and connection to this mission, we welcome you to apply and join us in building a vibrant and inclusive team at TP-Link Systems Inc. Please, no third-party agency inquiries, and we are unable to offer visa sponsorships at this time.
Irvine, CA, USA
$150,000-180,000/year
Workable
Data & BI Senior Data Engineer
Job Description: We are seeking a highly skilled and experienced Senior Data Engineer to join our team. The ideal candidate will have a strong background in data engineering, with a specialization in Matillion, SSIS, Azure DevOps, and ETL processes. This role will involve designing, developing, testing, and deploying ETL jobs, collaborating with cross-functional teams, and ensuring efficient data processing.

Key Responsibilities:
- Design, develop, test, and deploy Matillion ETL jobs in accordance with project requirements.
- Collaborate with the Data and BI team to understand data integration needs and translate them into Matillion ETL solutions.
- Create and modify Python code/components in Matillion jobs.
- Identify opportunities for performance optimization and implement enhancements to ensure efficient data processing.
- Collaborate with cross-functional teams, including database administrators, data engineers, and business analysts, to ensure seamless integration of ETL processes.
- Create and maintain comprehensive documentation for Matillion ETL jobs, ensuring knowledge transfer within the team.
- Create, test, and deploy SQL Server Integration Services (SSIS) packages and schedule them via the ActiveBatch scheduling tool.
- Create Matillion deployment builds using the Azure DevOps CI/CD pipeline and perform release manager activities.
- Review code of other developers (L2, L3-BI/DI) to ensure code standards and provide approval as part of code review activities.
- Resolve escalation tickets from the L2 team as part of the on-call schedule.
- Working knowledge of APIs and the Postman tool is an added advantage.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 5+ years of experience in data engineering, with a focus on ETL processes.
- Proficiency in Matillion, SSIS, Azure DevOps, and ETL.
- Strong knowledge of SQL, Python, and data integration techniques.
- Experience with performance optimization and data processing enhancements.
- Excellent collaboration and communication skills.
- Ability to work in a fast-paced, dynamic environment.

Preferred Skills:
- Experience with cloud platforms such as AWS or Azure.
- Knowledge of data warehousing and data modeling.
- Familiarity with DevOps practices and CI/CD pipelines.
- Strong problem-solving skills and attention to detail.
Atlanta, GA, USA
Negotiable Salary
Workable
Software Dev Engineer IV
Job Title: Software Dev Engineer IV
Location: Herndon, VA, 20171
Duration: 6 Months
Job Type: Contract
Work Type: Hybrid

Job Description: Design, develop, implement, test, document, and deploy full-stack, cloud-native, contact-center-related software applications, tools, systems, and services using multi-threaded programming, development in Python and React/Node.js, implementing architecture patterns and design patterns, and utilizing generative AI large language models. Assist in gathering and analyzing business and functional requirements, and translate requirements into technical specifications for robust, scalable, supportable solutions that work well within the overall system architecture. Own delivery of an entire piece of a system or application, and serve as technical lead on complex projects using best-practice engineering standards. Produce comprehensive, usable software documentation.

Qualifications: MS or BS in Computer Science, Computer or Electrical Engineering, Mathematics, or a related field, plus five years of progressively responsible experience in the job offered or in related occupations such as Software Engineer or Software Developer.

Required technical skills:
- Coding proficiency in Python, and front-end development experience with JavaScript/React.
- Proficiency in development with services such as AWS Lambda, Step Functions, DynamoDB, AppSync, Bedrock, SageMaker, and CloudWatch.
- Proficiency in developing and integrating with REST-based or GraphQL-based APIs.
- Proficiency in developing infrastructure-as-code deployment solutions such as AWS CloudFormation or AWS CDK.
- Experience collaborating with other developers using git repositories, including creating and managing feature branches, pull requests, code merges, and GitHub Actions or equivalent.

Preferred skills:
- Experience with contact center development and telephony infrastructure.
- Experience with prompt engineering for modern large language models.
- Experience using modern AI-based agentic coding assistants for code development, test development, and documentation.
- Track record of building successful serverless architectures following AWS Well-Architected principles.

Candidate Requirements:
- Years of Experience: 5+ years
- Degree or Certification: Bachelor's degree preferred
- Top 3 must-have hard skills:
  1. Generative AI-based coding
  2. AWS serverless
  3. Python and JavaScript/React

Required: 5+ years of experience as a Python Developer with JavaScript/React, AWS, and generative AI / AI / ML / MLOps. Hybrid only.
Herndon, VA 20170, USA
Negotiable Salary
© 2025 Servanan International Pte. Ltd.