A job opportunity
A job opportunity exists for the right candidate in Johannesburg.
Data Scientist
Employment Type: Permanent
Reference No: 1000103
Johannesburg
Employment Equity:
Historically disadvantaged South Africans will be given preference.
Nando's shall apply the affirmative action principles as set out in the Employment Equity Policy.
Requirements:
The Data Scientist’s primary responsibility is to leverage data and advanced analytics to deliver actionable business insights, build and deploy machine learning models, and support business intelligence initiatives. The role focuses on developing robust analytical solutions, collaborating with stakeholders, and ensuring operational excellence in data science projects within the Nando’s SA Region.
Minimum Requirements:
• 3–5 years’ experience as a Data Scientist or in a similar analytical role
• Experience in building and deploying machine learning models in production environments
• Experience developing forecasting models (e.g., time series) and classification models for predictive analytics
• Experience with data preprocessing, feature engineering, and model evaluation techniques
• Experience with Python programming and ML libraries (e.g., Scikit-learn, TensorFlow, PyTorch)
• Experience with SQL for data extraction and manipulation
• Experience in developing dashboards and reports using Power BI
• Experience implementing MLOps practices such as CI/CD, model monitoring, and retraining
• Experience working with cloud platforms such as Microsoft Azure, AWS, or Google Cloud Platform
• Experience in documenting analytical processes, models, and workflows
• Experience collaborating with cross-functional teams to deliver data-driven solutions
Detail Knowledge:
• Knowledge of machine learning algorithms and statistical modelling concepts (Advanced)
• Knowledge of data preprocessing, feature engineering, and model evaluation techniques (Advanced)
• Knowledge of Python programming and relevant libraries (Pandas, NumPy, Scikit-learn, TensorFlow, PyTorch) (Advanced)
• Knowledge of SQL and database querying for data extraction and manipulation (Intermediate)
• Knowledge of MLOps practices, including CI/CD, model versioning, and monitoring (Intermediate)
• Knowledge of cloud-based ML platforms (Azure Machine Learning, AWS SageMaker, GCP AI Platform) (Intermediate)
• Knowledge of data visualisation and dashboard development using Power BI (Advanced)
• Knowledge of data governance, privacy regulations (POPIA, GDPR), and ethical AI principles (Intermediate)
• Ability to design and implement scalable machine learning pipelines (Intermediate)
• Ability to communicate complex analytical concepts to technical and non-technical audiences (Advanced)
• Ability to work independently and manage multiple projects simultaneously (Advanced)
• Ability to adapt to emerging technologies and continuously learn new tools and techniques (Intermediate)
Position Description:
Data Analysis & Modelling:
• Collect, clean, and preprocess structured and unstructured data from various sources.
• Apply statistical analysis and machine learning techniques to extract insights and solve business problems.
• Build, validate, and optimise predictive and classification models using Python and relevant ML frameworks (e.g., Scikit-learn, TensorFlow, PyTorch).
• Document modelling approaches and results for reproducibility and knowledge sharing.
Business Intelligence & Reporting:
• Translate analytical findings into clear, visual data stories for stakeholders.
• Collaborate with business users to understand requirements and deliver relevant dashboards and reports.
• Ensure reports are accurate, timely, and aligned with business objectives.
Collaboration & Stakeholder Engagement:
• Establish and maintain effective working relationships with all internal and external stakeholders.
• Work closely with data engineers and other team members to define project requirements and deliver solutions.
• Participate in cross-functional meetings to identify analytical opportunities and communicate project progress.
MLOps & Model Deployment:
• Develop and maintain scalable machine learning pipelines for data ingestion, feature engineering, model training, and inference.
• Deploy models into production environments, ensuring reliability, performance, and compliance.
• Implement MLOps best practices, including CI/CD, model versioning, monitoring, and retraining strategies.
• Troubleshoot and resolve issues related to model performance and data quality in production.
Data Quality & Compliance:
• Ensure data quality through validation, profiling, and cleansing activities.
• Adhere to data privacy regulations and ethical AI principles in all analytical outputs.
• Maintain compliance with company policies, technology standards, and industry regulations (e.g., POPIA, GDPR, PCI DSS, HIPAA).
• Document data sources, transformations, and model decisions for auditability and transparency.
• Comply with all electronic and physical security procedures and standards when handling sensitive data.
• Enforce Nando’s information security policies and procedures within analytics workflows.
• Implement data access controls to ensure only authorised personnel can access sensitive data.
• Apply encryption techniques to protect sensitive data both in transit and at rest.
• Conduct regular vulnerability assessments and monitor systems for potential security breaches or suspicious activity.
• Suggest modifications and additions to standards and guidelines for data governance and security.
• Assist in enforcing business process controls such as self-assessments and assurance reviews to ensure data integrity.
• Support business continuity and disaster recovery planning for analytics systems and data pipelines.
• Educate team members on security best practices and compliance requirements related to data science projects.
Documentation & Communication:
• Document model architectures, data pipelines, and operational procedures.
• Prepare technical reports and presentations for both technical and non-technical audiences.
• Communicate complex analytical findings and model implications effectively to stakeholders and leadership.
• Maintain clear records of project milestones and deliverables.
Continuous Learning & Improvement:
• Stay updated with the latest developments in data science, machine learning, and MLOps.
• Evaluate and integrate new tools, techniques, and platforms to enhance analytical capabilities.
• Participate in relevant training, certification programmes, and knowledge-sharing sessions.
• Proactively seek feedback and opportunities for process improvement.
User Support and Training:
• Provide 2nd-level support for end users.
• Escalate urgent and unresolved tickets to the relevant parties.
• Work with business stakeholders, DBA, and BI & Data Engineers to support the activities and operational procedures required to deliver data services, including standard operating procedures and monitoring activities.
• Prepare or assist in the preparation of data training content in the quest to make Nando’s a data-driven organisation.
Change and Release Management:
• Implement data system changes in a controlled manner, including standard changes and emergency maintenance relating to business processes, applications, and infrastructure.
• Enable fast and reliable delivery of change to the business and mitigate the risk of negatively impacting the stability of the changed environment.
• Successfully deploy new data solutions and services in line with the agreed-on expectations and outcomes.
• Ensure that the implementation of new solutions and services has the necessary support, from planning to execution to post-implementation support and staffing.
Records Management and Reporting:
• Maintain all IT records and tracking for their area of responsibility and provide managers and users with regular updates, as well as any relevant status and progress information.
• Maintain a record of all inquiries from the initial call to incident resolution and provide the necessary information and documentation for issues that require escalation.
• Prepare and maintain records of assigned work orders and work performed, including inserting and updating computerised service tickets.
• Create and maintain documentation for all support processes.
• Complete the relevant documentation for in-house hardware and software systems as requested.
Remuneration:
Special Requirements:
Advantageous:
• Master’s degree in a relevant field.
• Certifications such as Microsoft Certified: Azure Data Scientist Associate (DP-100), Azure AI Engineer Associate (AI-102), Azure Fundamentals (AZ-900), Business Data Analytics (IIBA®-CBDA), Databricks Certified Machine Learning Associate.
• Experience in the restaurant, hospitality, or retail industry.
• Experience with various BI tools such as Power BI.
• Experience with Generative AI models and Large Language Models (LLMs) (e.g., GPT, BERT) for tasks such as text summarisation, feature extraction, or conversational analytics.
• Experience with containerisation and orchestration tools (Docker, Kubernetes).
• Project management skills for handling multiple analytics initiatives.
• SLA management skills.
• Experience with big data technologies such as Hadoop and Spark.
Starting Date:
2026/01/16
PLEASE NOTE
- Closing date: 2026/01/26