- Benoni
- Salary: Market Related
- Job Type: Permanent
- Sectors: Call Centre, IT, Insurance
- Reference: 2265383
- Employment Equity Position
Vacancy Details
Job Summary:
- The Software Developer role is key to building an advanced data management and analytics platform, integrating with multiple external APIs, and optimising real-time data workflows.
Key Performance Areas
- Develop and optimise Perl-based backend applications for data processing, analytics, and automation (or be willing to learn Perl quickly).
- Implement Perl OOP best practices and efficient data structures for large-scale processing.
- Build and maintain PHP-based applications and backend services.
- Design and manage high-performance Redis caching and data structures for real-time processing.
- Develop and maintain API integrations (REST, Webhooks) with external services (CRMs, analytics platforms, etc.).
- Work with JSON and JavaScript for API handling, data transformation, and UI interactions.
- Optimise MySQL databases, focusing on performance tuning, indexing, and partitioning.
- Deploy and manage Ubuntu Linux environments for high-traffic applications.
- Implement AI-based analytics using TensorFlow, vLLM, or Ollama to classify and analyse very large datasets.
- Ensure system reliability, security, and scalability across all integrations.
Key Tasks
Data Analysis and Interpretation
- Study statistical data to produce analyses of product performance.
- Analyse data to identify trends and patterns that impact decision-making.
- Support the actuarial first-line team in the preparation of the statutory returns.
- Support the actuarial first-line team in the pricing, profit testing and viability analysis of new products or enhancements on existing products.
- Support the actuarial first-line team in producing the financial projections for budgeting, ORSA and any other similar risk reporting requirements.
- Use a variety of predictive models to project outcomes, such as future events, claims ratios, risk events, or financial performance.
Develop and Optimise Perl-Based Backend Applications for Data Processing, Analytics, and Automation (or be willing to learn Perl quickly)
- Design and implement robust Perl scripts for data ingestion, transformation, and processing.
- Automate data workflows and reporting mechanisms.
- Optimise existing Perl code for better performance and scalability.
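For illustration only, a minimal sketch of the kind of ingestion and transformation script these tasks describe, assuming a hypothetical pipe-delimited input file (events.psv) and JSON-lines output; the field layout is a placeholder.

```perl
#!/usr/bin/env perl
# Minimal ingestion sketch: read pipe-delimited records, normalise fields,
# and emit one JSON document per line. Input layout is a placeholder.
use strict;
use warnings;
use JSON qw(encode_json);

open my $in, '<', 'events.psv' or die "cannot open events.psv: $!";
while (my $line = <$in>) {
    chomp $line;
    my ($id, $ts, $amount) = split /\|/, $line;
    next unless defined $amount;            # skip malformed rows
    my $record = {
        id        => $id,
        timestamp => $ts,
        amount    => sprintf('%.2f', $amount),
    };
    print encode_json($record), "\n";       # downstream consumers read JSON lines
}
close $in;
```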
Implement Perl OOP Best Practices and Efficient Data Structures for Large-Scale Processing
- Utilise Object-Oriented Programming principles to structure code for maintainability and efficiency.
- Design and implement advanced data structures to handle large datasets.
- Conduct code reviews to ensure adherence to best practices.
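A small sketch of the bless-based OOP structure implied here; the DataBatch class name and its fields are hypothetical.

```perl
package DataBatch;
# Hypothetical class illustrating bless-based Perl OOP for batch processing.
use strict;
use warnings;

sub new {
    my ($class, %args) = @_;
    my $self = { records => $args{records} // [] };
    return bless $self, $class;
}

# Return the number of records currently held.
sub count { scalar @{ $_[0]{records} } }

# Apply a transformation code ref to every record in place.
sub transform {
    my ($self, $code) = @_;
    $_ = $code->($_) for @{ $self->{records} };
    return $self;
}

1;
```

Typical use would be along the lines of `DataBatch->new(records => \@rows)->transform(\&normalise_row)`, with a richer class (Moose or Moo) substituted where stronger typing is needed.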
Build and Maintain PHP-Based Applications and Backend Services
- Develop secure and scalable PHP applications.
- Integrate PHP services with front-end applications and third-party APIs.
- Perform code optimisation and refactoring to improve performance.
Design and Manage High-Performance Redis Caching and Data Structures for Real-Time Processing
- Implement Redis caching strategies to enhance application performance.
- Manage Redis data structures for efficient real-time data access.
- Monitor and troubleshoot Redis-related issues.
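A minimal cache-aside sketch using the CPAN Redis module, assuming a local Redis instance; load_profile_from_db() is a hypothetical database lookup.

```perl
use strict;
use warnings;
use Redis;
use JSON qw(encode_json decode_json);

my $redis = Redis->new(server => '127.0.0.1:6379');

# Cache-aside read: serve from Redis when possible, fall back to the database
# and cache the serialised result with a short TTL.
sub get_profile {
    my ($user_id) = @_;
    my $key    = "profile:$user_id";
    my $cached = $redis->get($key);
    return decode_json($cached) if defined $cached;

    my $profile = load_profile_from_db($user_id);   # hypothetical DB lookup
    $redis->set($key, encode_json($profile));
    $redis->expire($key, 300);                      # 5-minute TTL
    return $profile;
}
```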
Develop and Maintain API Integrations (REST, Webhooks) with External Services (CRMs, Analytics Platforms, etc.)
- Design and implement RESTful APIs and Webhooks.
- Integrate with third-party services such as CRMs and analytics platforms.
- Ensure secure and reliable data exchange between systems.
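For illustration, a sketch of an outbound webhook-style call with LWP::UserAgent and JSON; the endpoint URL and payload fields are placeholders, not any specific CRM's API.

```perl
use strict;
use warnings;
use LWP::UserAgent;
use JSON qw(encode_json decode_json);

my $ua = LWP::UserAgent->new(timeout => 10);

# Push a lead record to a placeholder webhook endpoint as JSON.
my $payload  = { lead_id => 12345, status => 'qualified' };
my $response = $ua->post(
    'https://example.com/webhooks/leads',      # placeholder URL
    'Content-Type' => 'application/json',
    Content        => encode_json($payload),
);

die 'webhook call failed: ' . $response->status_line unless $response->is_success;
my $body = decode_json($response->decoded_content);
```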
Work with JSON and JavaScript for API Handling, Data Transformation, and UI Interactions
- Parse and manipulate JSON data for API communication.
- Develop JavaScript functions for data transformation and front-end interactions.
- Collaborate with front-end developers to integrate APIs into user interfaces.
Optimise MySQL Databases, Focusing on Performance Tuning, Indexing, and Partitioning
- Analyse and optimise SQL queries for performance.
- Implement indexing strategies to improve database efficiency.
- Use partitioning techniques to manage large datasets effectively.
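An indicative sketch of the indexing and range-partitioning work described above, issued through DBI; the events table, its columns, and the credentials are hypothetical.

```perl
use strict;
use warnings;
use DBI;

my $dbh = DBI->connect('DBI:mysql:database=analytics;host=localhost',
                       'app_user', 'secret', { RaiseError => 1 });

# Add a covering index for a frequent lookup pattern (hypothetical table/columns).
$dbh->do('CREATE INDEX idx_events_user_ts ON events (user_id, created_at)');

# Partition a large events table by year so old data can be archived or dropped cheaply.
$dbh->do(q{
    ALTER TABLE events
    PARTITION BY RANGE (YEAR(created_at)) (
        PARTITION p2023 VALUES LESS THAN (2024),
        PARTITION p2024 VALUES LESS THAN (2025),
        PARTITION pmax  VALUES LESS THAN MAXVALUE
    )
});
```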
Deploy and Manage Ubuntu Linux Environments for High-Traffic Applications
- Configure and maintain Ubuntu servers for optimal performance.
- Automate deployment processes using tools like Ansible or Docker.
- Monitor server health and implement security measures.
Develop Internal Tools and Dashboards Using Bootstrap 5, JavaScript, and WebSockets
- Create responsive and user-friendly dashboards with Bootstrap 5.
- Utilise WebSockets for real-time data updates.
- Build internal tools to support operational and analytical needs.
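A small sketch of a real-time update feed using Mojolicious::Lite's WebSocket support (one possible Perl option, not necessarily the stack in use); the /stats route and metric name are assumptions.

```perl
#!/usr/bin/env perl
# Sketch of a WebSocket endpoint pushing periodic stats to a dashboard page.
use Mojolicious::Lite -signatures;
use Mojo::IOLoop;
use Mojo::JSON qw(encode_json);

websocket '/stats' => sub ($c) {
    $c->inactivity_timeout(300);    # keep the connection open between pushes

    # Push a placeholder metric to the connected dashboard every 5 seconds.
    my $timer = Mojo::IOLoop->recurring(5 => sub {
        $c->send(encode_json({ active_calls => int rand 100 }));
    });
    $c->on(finish => sub { Mojo::IOLoop->remove($timer) });
};

app->start;
```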
Implement AI-Based Analytics Using TensorFlow, vLLM, or Ollama to Classify and Analyse Very Large Datasets
- Develop machine learning models for data classification and analysis.
- Integrate AI solutions into existing data workflows.
- Analyse large datasets to derive actionable insights.
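One possible shape for the LLM-based classification mentioned above: a hedged sketch calling a locally running Ollama instance over its HTTP generate endpoint from Perl; the model name, prompt, and label set are assumptions.

```perl
use strict;
use warnings;
use LWP::UserAgent;
use JSON qw(encode_json decode_json);

my $ua = LWP::UserAgent->new(timeout => 60);

# Ask a local Ollama model to classify a free-text record into one of a few
# placeholder categories. Model name and labels are assumptions.
sub classify_text {
    my ($text) = @_;
    my $res = $ua->post(
        'http://localhost:11434/api/generate',
        'Content-Type' => 'application/json',
        Content        => encode_json({
            model  => 'llama3',
            prompt => "Classify as COMPLAINT, QUERY or SALE: $text",
            stream => \0,          # JSON false: return a single response object
        }),
    );
    die 'Ollama call failed: ' . $res->status_line unless $res->is_success;
    return decode_json($res->decoded_content)->{response};
}
```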
Ensure System Reliability, Security, and Scalability Across All Integrations
- Implement monitoring and logging solutions to ensure system reliability.
- Apply security best practices to protect data and applications.
- Design systems for scalability to handle growing data volumes and user demands.
Required Skills
- MySQL: 2 to 3 years
- Perl Programming: 1 to 2 years
- PHP: 2 to 3 years
- JSON: 1 to 2 years
- Python: 1 to 2 years
- Ubuntu Linux: 1 to 2 years
- Redis: 2 to 3 years
- RESTful: 2 to 3 years
- UI/UX: 2 to 3 years
Candidate Requirements
Qualifications/Experience
- Expertise in Perl OOP and data structures (or willingness to learn quickly).
- Strong experience with Redis, including caching strategies, pub/sub, and data structure optimisation.
- Proficiency in PHP for backend services and API handling.
- Deep understanding of API integrations (REST, Webhooks).
- JSON and JavaScript proficiency for data handling and UI interactions.
- MySQL expertise, including query optimisation and indexing.
- Ubuntu Linux administration, particularly in high-load environments.
- Apache web server experience, including configuration, performance tuning, and security.
- Experience with LLM integration and text processing using Perl, Python, and language models is a bonus.
- Experience working with high-throughput data processing and automation.
- Strong problem-solving and debugging skills.
- Experience integrating telephony platforms (e.g., Asterisk, Twilio, Five9, Vicidial, etc.).
- Strong UI/UX skills for internal dashboards using Bootstrap 5.