C++ Market Data Engineer

Negotiable Salary

Trexquant Investment

Stamford, CT, USA

Description

Trexquant is a growing systematic fund at the forefront of quantitative finance, with a core team of highly accomplished researchers and engineers. To keep pace with our expanding global trading operations, we are seeking a C++ Market Data Engineer to design and build ultra-low-latency feed handlers for premier vendor feeds and major exchange multicast feeds. This is a high-impact role that sits at the heart of Trexquant's trading platform; the quality, speed, and reliability of your code directly influence every strategy we run.

Responsibilities

- Design and implement high-performance feed handlers in modern C++ for equities, futures, and options across global venues (e.g., NYSE, CME, Refinitiv RTS, Bloomberg B-PIPE).
- Optimize for micro- and nanosecond latency using lock-free data structures, cache-friendly memory layouts, and kernel-bypass networking where appropriate.
- Build reusable libraries for message decoding, normalization, and publication to internal buses shared by research, simulation, and live trading systems.
- Collaborate with cross-functional teams to tune TCP/UDP multicast stacks, kernel parameters, and NIC settings for deterministic performance.
- Provide robust failover, gap-recovery, and replay mechanisms to guarantee data integrity under packet loss or venue outages (a simplified illustration appears after this description).
- Instrument code paths with precision timestamping and performance metrics; drive continuous latency regression testing and capacity planning.
- Partner closely with quantitative researchers to understand downstream data requirements and to fine-tune delivery formats for both simulation and live trading.
- Produce clear architecture documents, operational run-books, and post-mortems; participate in a 24×7 follow-the-sun support rotation for mission-critical market-data services.

Requirements

- BS/MS/PhD in Computer Science, Electrical Engineering, or a related field.
- 3+ years of professional C++ (C++14/17/20) development experience focused on low-latency, high-throughput systems.
- Proven track record building or maintaining real-time market-data feeds (e.g., Refinitiv RTS/TREP, Bloomberg B-PIPE, OPRA, CME MDP, ITCH).
- Strong grasp of concurrency, lock-free algorithms, memory-model semantics, and compiler optimizations.
- Familiarity with serialization formats (FAST, SBE, Protocol Buffers) and time-series databases or in-memory caches.
- Comfort with scripting in Python for prototyping, testing, and ops automation.
- Excellent problem-solving skills, an ownership mindset, and the ability to thrive in a fast-paced trading environment.
- Familiarity with containerization (Docker/K8s) and public-cloud networking (AWS, GCP).

Benefits

- Competitive salary, plus a bonus based on individual and company performance.
- Collaborative, casual, and friendly work environment while solving the hardest problems in the financial markets.
- PPO health, dental, and vision insurance premiums fully covered for you and your dependents.
- Pre-tax commuter benefits.

Trexquant is an Equal Opportunity Employer.
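
The gap-recovery responsibility above lends itself to a small illustration. The following is a minimal, self-contained C++ sketch, not Trexquant's actual code: the PacketHeader layout and field names are hypothetical, and real venues differ in how sequence numbers are carried. It shows per-channel sequence-gap detection of the kind a feed handler might perform before requesting retransmission or replay.

#include <cstdint>
#include <initializer_list>
#include <iostream>
#include <optional>
#include <utility>

// Hypothetical decoded packet header: most exchange multicast feeds carry a
// monotonically increasing sequence number per channel.
struct PacketHeader {
    std::uint64_t sequence;   // first sequence number carried by the packet
    std::uint32_t msg_count;  // number of messages in the packet
};

class GapDetector {
public:
    // Returns the missing [first, last] sequence range when a gap is detected,
    // or std::nullopt for in-order packets and harmless duplicates.
    std::optional<std::pair<std::uint64_t, std::uint64_t>>
    on_packet(const PacketHeader& hdr) {
        if (!initialized_) {                 // first packet seeds the expected sequence
            initialized_ = true;
            expected_ = hdr.sequence + hdr.msg_count;
            return std::nullopt;
        }
        if (hdr.sequence == expected_) {     // in order: advance and continue
            expected_ += hdr.msg_count;
            return std::nullopt;
        }
        if (hdr.sequence < expected_) {      // duplicate (e.g., A/B feed arbitration): drop
            return std::nullopt;
        }
        auto gap = std::make_pair(expected_, hdr.sequence - 1);  // sequences never seen
        expected_ = hdr.sequence + hdr.msg_count;                // resync past the gap
        return gap;
    }

private:
    bool initialized_ = false;
    std::uint64_t expected_ = 0;
};

int main() {
    GapDetector detector;
    for (PacketHeader hdr : {PacketHeader{1, 3}, PacketHeader{4, 2}, PacketHeader{9, 1}}) {
        if (auto gap = detector.on_packet(hdr)) {
            // In a real handler this event would be handed to the gap-recovery / replay path.
            std::cout << "gap detected: " << gap->first << ".." << gap->second << '\n';
        }
    }
}

In a production handler this logic would sit on the hot path behind a kernel-bypass receive loop and publish gap events to a recovery or replay component rather than printing them.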

Source: workable

Location
Stamford, CT, USA