
Database Coordination (Mountain View)

Negotiable Salary

3980 Ventura Ct, Palo Alto, CA 94306, USA


Description

A Porsche dealer with a huge inventory of parts. It's fun to sort, sell, and ship them to happy customers.

Source: craigslist

Location
3980 Ventura Ct, Palo Alto, CA 94306, USA


You may also like

Craigslist
🧑‍💻 Hands-On AI Development Training and Projects
We invite highly motivated individuals with a strong interest in artificial intelligence to join our innovative team and pursue a fulfilling, long-term career. We're a respected tech company looking for individuals who are eager to learn and grow. Interested in gaining real-world experience in software and AI development? This remote, flexible program supports part-time or full-time commitment and includes over 600 hours of structured, guided tasks using professional tools. You'll be prepared to pursue a role as an AI Developer.

You'll cover:

🖥️ Computer & Software Fundamentals
• Learn how computing systems and networks work
• Study algorithms, machine design, and core cybersecurity concepts
• Begin coding in Python

💻 Web & App Development
• Build websites using HTML, CSS, and JavaScript
• Leverage frameworks like Bootstrap and React.js
• Use Git and GitHub for version control and collaboration

🧠 AI & Machine Learning
• Learn core machine learning, neural networks, and data science
• Work with APIs from OpenAI, and tools like TensorFlow and Pandas
• Develop features such as chatbots and automation
• Train, test, and visualize models
• Use Docker and project planning tools

🗄️ Database & Backend Skills
• Design databases and write SQL queries
• Learn full CRUD operations and backend integration

🧪 Capstone Project
• Build a software product from scratch
• Show version control, debugging, and technical documentation
• Simulate Agile teams using Scrum methods

🧰 Career Prep
• Practice coding interview formats
• Build your technical résumé
• Prepare for real-world job applications in tech

No experience needed. Remote applicants welcome. Apply here: https://aitraining.compare
604 Gallatin St NW, Washington, DC 20011, USA
$60,000/year
Workable
Splunk Engineer - Active TS/SCI Required
You will work with an expert team focused on implementing and operating next-generation security solutions for government and commercial clients. You'll use Splunk and integrate it with other state-of-the-art tools like HBSS, Enterprise Security Manager (ESM), Network Security Manager (NSM), NetFlow, and/or Intrusion Detection Systems (IDS) to monitor, detect, and analyze threats. You'll perform hands-on evaluation, implementation, and operation of leading cyber defense tools and technologies, and apply defense-in-depth strategies for large and complex networks to rapidly identify vulnerabilities and threats, prioritize response actions, and develop effective countermeasures. You'll support the risk management and security compliance of specified cybersecurity tools, and apply thought leadership to solving complex security challenges in a highly collaborative and innovative work environment.

Requirements
• 3+ years of experience utilizing Splunk Enterprise
• Experience with deploying, configuring, and performing functional testing and data validation in a Splunk environment
• Experience with Splunk systems administration, including installation, configuration, monitoring system performance and availability, upgrades, and troubleshooting in Windows and Linux Server environments
• Experience creating custom dashboards, writing queries, generating reports, and setting up alerts and notifications
• Familiarity with the DoD Risk Management Framework
• Top Secret/SCI clearance with the ability to obtain a Counter-Intelligence polygraph
• HS diploma or GED and 7+ years of experience supporting IT projects and activities, an Associate's degree and 5+ years of such experience, or a Bachelor's degree and 3+ years of such experience
• DoD 8570 IAT Level II Certification, including CCNA-Security, CySA+, GICSP, GSEC, Security+ CE, CND, or SSCP
• Ability to obtain a DoD 8570.01-M Cybersecurity Service Provider - Infrastructure Support Certification, including CEH, CySA+, GICSP, SSCP, CHFI, CFR, Cloud+, or CND, prior to start date

Optional Qualifications:
• Ability to ingest and parse logs within Splunk
• Experience with field abstraction
• Experience with data modeling using Splunk
• Experience with workflows and drilldown queries
• Experience administering Splunk in distributed deployments
• Experience performing site surveys, data gathering, and research and analysis regarding deploying and implementing security tools
• Splunk Certified Power User or other advanced Splunk certification
• Experience with DevSecOps and Elasticsearch, Logstash & Kibana (ELK)
• Excellent oral and written communication skills, including presentation expertise to convey complex ideas to clients and internal staff
• Excellent problem-solving skills

Benefits
Essential Network Security (ENS) Solutions, LLC is a service-disabled veteran-owned, highly regarded IT consulting and management firm. ENS consults for the Department of Defense (DoD) and Intelligence Community (IC), providing innovative solutions in the core competency areas of Identity, Credential and Access Management (ICAM), Software Development, Cyber and Network Security, System Engineering, Program/Project Management, and IT support, solutions, and services that yield enduring results. Our strong technical and management experts have maintained a standard of excellence in their relationships while delivering innovative, scalable, and collaborative infrastructure to our clients.

Why ENS?
• Free Platinum-Level Medical/Dental/Vision coverage, 100% paid for by ENS
• 401k contribution from Day 1
• PTO + 11 paid federal holidays
• Long- & Short-Term Disability Insurance
• Group Term Life Insurance
• Tuition, Certification & Professional Development Assistance
• Workers' Compensation
• Relocation Assistance
Washington, DC, USA
Negotiable Salary
Workable
Senior Data Engineer - Active Secret Clearance
Location: Washington, DC (Hybrid)
Clearance Required: Active Secret
Position Type: Full-Time

We are seeking a highly skilled Senior Data Engineer to support the U.S. Coast Guard Office of Data & Analytics (ODA). The ideal candidate will bring expertise in Databricks, SQL, ETL pipelines, and cloud-based data frameworks to enable enterprise-scale analytics and decision-making. This role will involve designing, building, and orchestrating data pipelines, collaborating with analysts and developers, and implementing CI/CD practices to ensure scalable, reliable, and secure data solutions.

Primary Responsibilities:
• Build, test, and orchestrate data pipelines in Databricks.
• Design and optimize data structures, schemas, and ETL processes.
• Translate business use cases into SQL queries, reports, and dashboards.
• Manage database objects in development environments without impacting production.
• Automate workflows using Databricks Workflows or the Jobs API.
• Integrate code into GitHub/GitLab and support CI/CD practices.
• Diagnose data gaps and quality issues, and design test cases for validation.
• Collaborate with developers and analysts to synchronize code and ensure reliability.

Requirements
Minimum Qualifications:
• Active Secret clearance (required at time of application).
• Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
• 7+ years of professional experience in data engineering.
• Expert-level proficiency in Databricks, Apache Spark, and SQL.
• Experience with ETL tools such as Kafka, Airflow, or AWS Glue.
• Proficiency with CI/CD pipelines and GitHub/GitLab version control.
• Knowledge of cloud platforms (AWS, Azure, GCP).
• Excellent problem-solving, debugging, and collaboration skills.

Preferred Qualifications:
• Experience supporting federal or mission-critical analytics programs.
• Knowledge of data governance and metadata management practices.
• Familiarity with containerization or orchestration tools (Docker, Kubernetes).

Eligibility: Must be legally authorized to work in the United States without employer sponsorship, now or in the future. Active Secret clearance required for this role.

Benefits
Salary: Competitive, commensurate with experience.
Washington, DC, USA
Negotiable Salary
Workable
Senior Data Scientist - Active Secret Clearance
Location: Washington, DC (Hybrid)
Clearance Required: Active Secret
Position Type: Full-Time

We are seeking a highly skilled Senior Data Scientist to support enterprise data and analytics initiatives for our federal client. The ideal candidate will bring expertise in AI/ML development, advanced analytics, and automation to deliver mission-focused solutions. This role will involve providing technical leadership, developing predictive models, and collaborating with cross-functional teams to ensure analytics capabilities are scalable, secure, and aligned with enterprise objectives. Strong proficiency with Python, AI/ML frameworks, and RPA tools is required.

Primary Responsibilities:
• Provide technical leadership on enterprise analytics and AI/ML solutions.
• Design, develop, and deploy machine learning models and predictive analytics.
• Utilize Python, Anaconda, and Jupyter notebooks for modeling and prototyping.
• Collaborate with Data Engineers to integrate models into scalable data pipelines.
• Support data governance activities including tagging, metadata management, and secure sharing.
• Develop RPA solutions to automate business processes.
• Evaluate and recommend analytics products to ensure mission alignment.
• Communicate findings through dashboards, data visualizations, and technical reports.

Requirements
Minimum Qualifications:
• Active Secret clearance (required at time of application).
• Bachelor's or Master's degree in Data Science, Computer Science, Mathematics, or a related field.
• 7+ years of professional experience in data science, AI/ML, and advanced analytics.
• Strong proficiency with Python, Anaconda, and Jupyter notebooks.
• Expertise in AI/ML frameworks such as TensorFlow, PyTorch, and Scikit-learn.
• Experience with RPA tools (UiPath, Automation Anywhere, or Python-based automation).
• Familiarity with Databricks, Apache Spark, and SQL.
• Excellent communication, documentation, and stakeholder engagement skills.

Preferred Qualifications:
• Experience supporting federal or regulated environments.
• Knowledge of MLOps practices for deploying and managing ML models at scale.
• Background in data governance and enterprise metadata management.

Eligibility: Must be legally authorized to work in the United States without employer sponsorship, now or in the future. Active Secret clearance required for this role.

Benefits
Salary: Competitive, commensurate with experience.
Washington, DC, USA
Negotiable Salary
Workable
Platform Engineer AI Tool & Integration
Platform Engineer - AI Tool & Integration

Position Overview: The Data Analytics & AI department at RCG is seeking a highly skilled and experienced Software & Platform Engineer to join our team. This pivotal role requires a strong technical background in AI tooling, data platform architecture, cloud computing, and big data technologies. The successful candidate will be responsible for all tooling and integration with GenAI LLMs, maintaining our Azure platform with OpenAI, and leveraging Databricks Mosaic AI. This role will be instrumental in driving innovation, ensuring seamless integration, and optimizing our AI and data platforms to meet the evolving needs of the business.

Key Responsibilities:
• Design, develop, and maintain tooling and integration for GenAI LLMs and other AI models.
• Manage and optimize our Azure platform, ensuring seamless integration with OpenAI and Databricks Mosaic AI.
• Collaborate with cross-functional teams to identify and implement innovative AI solutions to enhance our platform capabilities.
• Stay up to date with the latest advancements in AI and machine learning technologies to drive continuous improvement and innovation.
• Develop and implement best practices for AI model deployment & scaling.
• Design and execute integration strategies for incorporating large language models (LLMs) and CoPilot technologies with existing business platforms.
• Assess and recommend suitable LLM and CoPilot technologies that align with business needs and technical requirements.
• Conduct feasibility studies and proof-of-concepts to validate the integration of new tools and technologies.
• Keep abreast of the latest advancements in LLM, CoPilot, and related technologies to identify opportunities for further innovation and improvement.
• Understand and leverage MS CoPilot and MS CoPilot Studio for enhanced productivity and collaboration within the development team.
• Integrate MS CoPilot tools into existing workflows and ensure seamless integration with other systems and applications used by the team.
• Work closely with product managers, data scientists, and other stakeholders to gather requirements and ensure successful integration of LLM and CoPilot technologies.

Requirements:
• Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
• Proven experience in software/system/platform engineering, with a focus on AI tooling and integration.
• Strong expertise in working with Azure, including managing and integrating AI services such as OpenAI and Databricks Mosaic AI.
• Proficiency in programming languages such as Python, Java, or C++.
• Experience with machine learning frameworks and libraries (e.g., TensorFlow, PyTorch).
• Solid understanding of software development methodologies, including Agile and DevOps practices and tools for continuous integration and deployment.
• Understanding of data security best practices and compliance requirements in software development and integration.
• Excellent problem-solving skills and the ability to work in a fast-paced, dynamic environment.
• Strong communication and collaboration skills.

Preferred Qualifications:
• Strong proficiency in one or more programming languages such as Python, Java, C#, or JavaScript.
• Experience with large language models (LLMs) and familiarity with tools like OpenAI GPT, Google BERT, or similar.
• Hands-on experience with Databricks for data engineering, data science, or machine learning workflows.
• Proficiency in using Databricks for building and managing data pipelines, ETL processes, and real-time data processing.
• Hands-on experience with Microsoft CoPilot and CoPilot Studio.
• Proficiency in working with APIs, microservices architecture, and web services (REST/SOAP).
• Familiarity with cloud platforms such as AWS, Azure, or Google Cloud, and their integration services.
• Knowledge of database systems, both SQL and NoSQL (e.g., MySQL, MongoDB).
• Knowledge of natural language processing (NLP) and large language models (LLMs).
• Previous experience in a similar role within a technology-driven organization.
• Certifications in Azure, AI, or related areas.
Coral Gables, FL, USA
Negotiable Salary
© 2025 Servanan International Pte. Ltd.