
TIBCO Developer

Negotiable Salary

Axiom Software Solutions Limited

Jackson, MI, USA


Description

Role: TIBCO Developer
Location: Jackson, MI - Hybrid
Position Type: Contract

We are seeking a TIBCO Developer to join the integration team supporting Consumers Energy in Jackson, MI. The ideal candidate will have strong expertise in TIBCO BusinessWorks 6.x and a solid understanding of enterprise integration patterns.

Responsibilities:
- Design, develop, and support integration solutions using TIBCO BusinessWorks 6.x and 5.x, TIBCO EMS, API development, and TIBCO BusinessEvents
- Develop and maintain APIs using TIBCO API Exchange, including RESTful and SOAP web services
- Configure and monitor messaging using TIBCO EMS, Hawk, and Rendezvous
- Collaborate with business analysts, architects, and other developers to deliver high-quality integration solutions
- Troubleshoot and resolve production issues in a timely manner
- Participate in code reviews and deployment processes
- Ensure integration components adhere to security, performance, and scalability standards

Required Qualifications:
- 5+ years of hands-on experience with:
  - TIBCO BusinessWorks (BW) 6.x and 5.x
  - TIBCO EMS, Hawk, BusinessEvents
  - TIBCO Rendezvous and API Exchange Gateway
- Solid understanding of SOA and integration architecture
- Experience with RESTful and SOAP APIs
- Familiarity with DevOps, CI/CD pipelines, and Git-based versioning
- Strong debugging and troubleshooting skills in enterprise environments
- Ability to work independently in a hybrid/remote setup
- Strong verbal and written communication skills

Nice to Have:
- Knowledge of containers (Docker/Kubernetes) and cloud platforms (Azure or AWS)
- Experience with other integration technologies or scripting languages
- Prior experience working in the utility or energy sector

Note: Candidates need both BW and BE development skills, not just one or the other, along with API development experience and good communication skills in clear English. We are also asking whether candidates are willing to relocate to Jackson, MI.

Source: Workable

Location
Jackson, MI, USA


You may also like

Workable
Low-Latency Developer
Atto Trading, a dynamic quantitative trading firm founded in 2010 and a leader in global high-frequency strategies, is looking for a Low-Latency Developer to join our team. We are expanding an international, diverse team with experts in trading, statistics, engineering, and technology. Our disciplined approach combined with rapid market feedback allows us to quickly turn ideas into profit. Our environment of learning and collaboration allows us to solve some of the world's hardest problems, together. As a small firm, we remain nimble and hold ourselves to the highest standards of integrity, ingenuity, and effort.

Position Highlights:
We are modernizing our trading and research platform to scale our alpha trading business. This platform will enable researchers to explore, test, and deploy sophisticated signals, models, and strategies across asset classes in a robust, fully automated manner while maintaining competitive latency targets. As a Low-Latency Developer, you will be responsible for designing, optimizing, and maintaining high-performance trading systems to minimize latency.

Your Mission and Goals:
- Analyze and optimize the performance of low-latency trading systems by identifying bottlenecks and inefficiencies in the code, and implementing effective solutions.
- Develop and adapt the platform to support the demands of a fast-paced trading environment, while effectively managing technical debt.

Requirements:
- Over 5 years of experience as a low-latency developer with a focus on performance optimization in a high-frequency trading (HFT) environment.
- Experience with multiple components of an HFT platform or system, particularly those on the critical path.
- Experience working at an HFT firm during its startup phase and/or on a trading team is a significant plus.

Technical Skills:
- Deep knowledge of HFT platforms: networking, kernel bypass, market data, order entry, threading, inter-process communication, and strategy APIs.
- Proven low-latency development and performance optimization in HFT.
- Strong proficiency in C++.
- Excellent understanding of CPU caches and cache efficiency.
- Experience with multithreaded and multi-process synchronization.
- Good understanding of networking protocols.
- Skilled in performance profiling and optimization tools.
- Advanced knowledge of Linux operating systems, including kernel-level device mechanisms.

About You:
- Practical decision-making skills.
- Excellent communication skills.
- Strong analytical and problem-solving skills.
- Passion for trading.
- Ability to work independently and as part of a team.

Benefits:
- Competitive rates of pay.
- Paid time off (5 weeks).
- Coverage of health insurance costs.
- Office lunches.
- Discretionary bonus system.
- Annual base salary range of $150,000 to $300,000. Pay (base and bonus) may vary depending on job-related skills and experience.

Our motivation: We are a company committed to staying at the forefront of technology. Our team is passionate about continual learning and improvement. With no external investors or customers, we are the primary users of the products we create, giving you the opportunity to make a real impact on our company's growth. Ready to advance your career? Join our innovative team and help shape the future of trading on a global scale. Apply now and let's create the future together!
New York, NY, USA
$150,000-300,000/year
Workable
Senior Big Data Engineer
ABOUT US:
Headquartered in the United States, TP-Link Systems Inc. is a global provider of reliable networking devices and smart home products, consistently ranked as the world's top provider of Wi-Fi devices. The company is committed to delivering innovative products that enhance people's lives through faster, more reliable connectivity. With a commitment to excellence, TP-Link serves customers in over 170 countries and continues to grow its global footprint.

We believe technology changes the world for the better! At TP-Link Systems Inc., we are committed to crafting dependable, high-performance products to connect users worldwide with the wonders of technology. Embracing professionalism, innovation, excellence, and simplicity, we aim to assist our clients in achieving remarkable global performance and enable consumers to enjoy a seamless, effortless lifestyle.

KEY RESPONSIBILITIES
- Develop and maintain the Big Data Platform by performing data cleansing, data warehouse modeling, and report development on large datasets.
- Collaborate with cross-functional teams to provide actionable insights for decision-making.
- Manage the operation and administration of the Big Data Platform, including system deployment, task scheduling, proactive monitoring, and alerting to ensure stability and security.
- Handle data collection and integration tasks, including ETL development, data de-identification, and managing data security.
- Provide support for other departments by processing data, writing queries, developing solutions, performing statistical analysis, and generating reports.
- Troubleshoot and resolve critical issues, conduct fault diagnosis, and optimize system performance.

REQUIRED QUALIFICATIONS
- Bachelor's degree or higher in Computer Science or a related field, with at least three years of experience maintaining a Big Data platform.
- Strong understanding of Big Data technologies such as Hadoop, Flink, Spark, Hive, HBase, and Airflow, with proven expertise in Big Data development and performance optimization.
- Familiarity with Big Data OLAP tools like Kylin, Impala, and ClickHouse, as well as experience in data warehouse design, data modeling, and report generation.
- Proficiency in Linux development environments and Python programming.
- Excellent communication, collaboration, and teamwork skills, with a proactive attitude and a strong sense of responsibility.

PREFERRED QUALIFICATIONS
- Experience with cloud-based deployments, particularly AWS EMR; familiarity with other cloud platforms is a plus.
- Proficiency in additional languages such as Java or Scala is a plus.

Benefits
- Salary Range: $150,000 - $180,000
- Free snacks and drinks, and provided lunch on Fridays
- Fully paid medical, dental, and vision insurance (partial coverage for dependents)
- Contributions to 401k funds
- Bi-annual reviews and annual pay increases
- Health and wellness benefits, including free gym membership
- Quarterly team-building events

At TP-Link Systems Inc., we are continually searching for ambitious individuals who are passionate about their work. We believe that diversity fuels innovation, collaboration, and drives our entrepreneurial spirit. As a global company, we highly value diverse perspectives and are committed to cultivating an environment where all voices are heard, respected, and valued. We are dedicated to providing equal employment opportunities to all employees and applicants, and we prohibit discrimination and harassment of any kind based on race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state, or local laws. Beyond compliance, we strive to create a supportive and growth-oriented workplace for everyone.

If you share our passion and connection to this mission, we welcome you to apply and join us in building a vibrant and inclusive team at TP-Link Systems Inc. Please, no third-party agency inquiries; we are unable to offer visa sponsorship at this time.
Irvine, CA, USA
$150,000-180,000/year
Craigslist
Part-Time Laboratory Support/Admin (San Diego)
Part-Time Laboratory Support/Admin - San Diego, CA

We are seeking a Part-Time Laboratory Support/Admin to join our team in San Diego. This role is designed to provide support to our full-time laboratory staff, helping ensure smooth and efficient day-to-day operations.

Schedule:
- Monday through Friday, 8:30 AM - 1:00 PM
- On-site at our San Diego laboratory

Responsibilities:
In this position, you will work closely with our full-time laboratory team member, assisting with daily tasks such as:
- Helping log and track receipt of daily projects
- Assisting with communication to clients regarding questions or errors on submitted projects
- Supporting inventory management and supply tracking
- Performing data entry for project data and quality control (QC) data
- Helping review QC reports and confirmations

This is a support role, where you'll play an important part in keeping our operations organized and running smoothly, while learning about lab workflows.

Qualifications:
- Strong attention to detail and organizational skills
- Excellent communication skills (written and verbal)
- Ability to work collaboratively and follow direction
- Comfortable managing multiple tasks in a busy environment
- Previous lab or administrative experience is helpful but not required

Why Join Us?
This is a great opportunity to gain hands-on experience in laboratory operations while supporting an experienced full-time team member. You'll contribute to meaningful projects in a professional, growth-oriented environment.
7902 Convoy Ct, San Diego, CA 92111, USA
$20/hour
Workable
Data & BI Senior Data Engineer
Job Description:
We are seeking a highly skilled and experienced Senior Data Engineer to join our team. The ideal candidate will have a strong background in data engineering, with a specialization in Matillion, SSIS, Azure DevOps, and ETL processes. This role will involve designing, developing, testing, and deploying ETL jobs, collaborating with cross-functional teams, and ensuring efficient data processing.

Key Responsibilities:
- Design, develop, test, and deploy Matillion ETL jobs in accordance with project requirements.
- Collaborate with the Data and BI team to understand data integration needs and translate them into Matillion ETL solutions.
- Create and modify Python code/components in Matillion jobs.
- Identify opportunities for performance optimization and implement enhancements to ensure efficient data processing.
- Collaborate with cross-functional teams, including database administrators, data engineers, and business analysts, to ensure seamless integration of ETL processes.
- Create and maintain comprehensive documentation for Matillion ETL jobs, ensuring knowledge transfer within the team.
- Create, test, and deploy SQL Server Integration Services (SSIS) packages and schedule them via the ActiveBatch scheduling tool.
- Create Matillion deployment builds using Azure DevOps CI/CD pipelines and perform release manager activities.
- Review code of other developers (L2, L3-BI/DI) to ensure code standards and provide approval as part of code review activities.
- Resolve escalation tickets from the L2 team as part of the on-call schedule.
- Working knowledge of APIs and the Postman tool is an added advantage.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 5+ years of experience in data engineering, with a focus on ETL processes.
- Proficiency in Matillion, SSIS, Azure DevOps, and ETL.
- Strong knowledge of SQL, Python, and data integration techniques.
- Experience with performance optimization and data processing enhancements.
- Excellent collaboration and communication skills.
- Ability to work in a fast-paced, dynamic environment.

Preferred Skills:
- Experience with cloud platforms such as AWS or Azure.
- Knowledge of data warehousing and data modeling.
- Familiarity with DevOps practices and CI/CD pipelines.
- Strong problem-solving skills and attention to detail.
Atlanta, GA, USA
Negotiable Salary
© 2025 Servanan International Pte. Ltd.