Data/Machine Learning Infrastructure Engineer
Overview
Wavelo is a SaaS business on a mission to make telecoms a breeze. We provide flexible software that modernizes how communication service providers (CSPs) do business, helping them drive more value, focus on customer experience, and scale their operations faster.
Job Description
Wavelo builds cutting-edge software for the telecom industry, specializing in billing, orchestration, and provisioning solutions. Our mission is to empower telecom providers with robust tools that streamline operations and drive impactful decisions. We are now expanding our AI capabilities to derive deep insights from our event-stream data, enabling us to take meaningful actions that enhance our products and customer experience.
Responsibilities
- Deploy AI and machine learning models within Wavelo's production environment, aligning with our software architecture to drive actionable insights from our telecom event streams.
- Develop and maintain Kafka pipelines to support real-time data processing, facilitating the flow of critical insights and enabling swift action on event-stream data.
- Manage model training, tuning, and retraining workflows, ensuring that models are optimized to capture actionable insights from our event streams.
- Work with data engineers, product teams, and other stakeholders to align on data needs, model requirements, and integration objectives.
- Develop and implement monitoring solutions for model performance within the event-stream environment, identifying and resolving issues proactively.
- Document processes, model configurations, and best practices to promote knowledge sharing and continuous improvement across Wavelo.
Required Skills
- Extensive experience with machine learning, deep learning, and AI data processing clusters, including Apache Spark and other relevant technologies.
- Exposure to training models either locally or through cloud platforms such as Google Vertex AI, AWS SageMaker, or Azure ML.
- Proficiency with Apache Kafka for setting up, managing, and scaling event-stream pipelines.
- Familiarity with LangChain and related AI/ML frameworks (TensorFlow, PyTorch, scikit-learn) and cloud environments (AWS, Azure, Google Cloud).
- Experience with RESTful APIs, microservices, and containerization technologies such as Docker and Kubernetes.
- Solid grounding in data engineering practices, with a strong emphasis on handling and cleaning messy, unstructured data to ensure quality for downstream machine learning tasks.
- Analytical and proactive approach to resolving integration and performance challenges.
- Strong ability to convey technical concepts to both technical and non-technical stakeholders, with a collaborative and team-oriented mindset.
Benefits
- Fair compensation and generous benefits
- Remote work flexibility
- Commitment to inclusion and diversity
- Support for individuals with disabilities
- Participation in the E-Verify program for US employees
About the company
Wavelo builds cutting-edge software for the telecom industry, specializing in billing, orchestration, and provisioning solutions. Our mission is to empower telecom providers with robust tools that streamline operations and drive impactful decisions.