Muhammad Salman Malik DevOps Engineer

I am writing to express my interest in the DevOps and Data Science Instructor position. As a seasoned DevOps Engineer and Data Scientist, I bring a unique blend of skills in CI/CD pipeline design, cloud technologies, containerisation, orchestration, and data-driven problem-solving.

My experience spans implementing robust and scalable cloud-based solutions using AWS technologies, leveraging tools like Docker, Kubernetes, Ansible, and Terraform to automate operations, and applying machine learning techniques to real-world challenges. I am also proficient in Python and Shell scripting.

My GitHub showcases projects such as a Python-based microservices application on AWS Elastic Kubernetes Service (EKS) that meets its functional requirements while exemplifying best practices in cloud-native development and Kubernetes orchestration; a CI/CD pipeline for a Java application integrating Jenkins, SonarQube, Argo CD, and Kubernetes; and a machine learning model for predicting Airbnb prices, built with MLOps best practices, Streamlit, Docker, and Kubernetes.

As an instructor, I aim to share my knowledge and passion for DevOps and Data Science, fostering an engaging and interactive learning environment. My unique blend of skills enables me to deliver comprehensive solutions that drive efficiency and innovation, and I am excited about the opportunity to inspire and educate future professionals in these fields.

Subjects

  • Data Science Beginner-Intermediate

  • DevOps Beginner-Intermediate


Experience

  • Senior DevOps Engineer (Feb 2024 – Present) at TecBrix Cloud
    • Orchestrated a complex migration to Azure Kubernetes Service (AKS), deploying a multi-tiered, auto-scaling application architecture.
    • Crafted Azure DevOps pipelines integrating Terraform for infrastructure as code (IaC), achieving seamless deployments across Azure, AWS, and Google Cloud Platform.
    • Led Inter-cloud migration projects, transitioning critical services by leveraging advanced data replication techniques and containerization.
    • Engineered a DevSecOps framework incorporating automated security scanning and compliance checks into CI/CD pipelines.
  • Cloud Infrastructure Strategy Consultant (Dec 2023 – Present) at Shop Online New York
    • Strategised and designed a scalable and secure AWS cloud infrastructure for the e-commerce project.
    • Provided strategic guidance on leveraging AWS CloudFormation for automating cloud infrastructure provisioning and deploying Jenkins within AWS for CI/CD pipelines.
    • Orchestrated comprehensive training programs in AWS and Azure cloud services, as well as DevOps methodologies.
  • DevOps Engineer (Sep 2023 – Feb 2024) at Loop
    • Provisioned and managed Kubernetes clusters using Terraform and kOps.
    • Led the transition of all services from deploy scripts to Docker Compose.
    • Used Helm to deploy applications and databases, including PostgreSQL, MongoDB and RabbitMQ.
    • Set up Security Onion for network monitoring and CVE tracking using FleetDM.
    • Integrated Sentry, providing real-time error tracking for deployments.
    • Utilised Ansible to standardise server setups.
    • Delivered real-time visibility of cash-in-transit operations using Grafana and Prometheus.
    • Used the GitHub Actions Importer for a smooth migration of CI/CD pipelines.
    • Migrated selected services from AWS to Hetzner for cost efficiency.
  • Associate DevOps Engineer (Dec 2022 – Present) at Rapid Solutions (rapsols), Amsterdam
    • Designed and implemented the Databricks Data Lakehouse platform, one of my major accomplishments.
    • Built an in-house development environment using Docker and Kubernetes, with upstream operators such as Cloud Native Postgres, Elasticsearch and MariaDB.
    • Monitored Kubernetes and application metrics using the official Kube Prometheus Stack Helm chart.
    • Centralised logging using Loki.
    • Published Docker images using Artifactory.
    • Used AWS for cloud-based solutions, leveraging EC2 instances, RDS databases, S3, Lambda and CloudWatch to store, process, monitor and manage client data.
    • Fostered team collaboration and continuous improvement in infrastructure development by integrating Terraform code with Git.
    • Implemented GitOps best practices and introduced Ansible for configuration management, enhancing deployment traceability, enabling automated rollbacks, and automating server setup and maintenance.
    • Utilised vSphere for virtualisation, enhancing our infrastructure's efficiency and resilience.
    • Built GitLab CI/CD pipelines to automate the building and deployment of data pipelines for ETL processes.
    • Gained proficiency in Linux system administration, ensuring the smooth operation of our development environment.
    • Worked on the Hadoop ecosystem, including HDFS and Hive, to store and process large volumes of structured and unstructured data.
    • Used Jira for collaboration and project management to track project progress and ensure timely delivery.
  • Social Media Data Analyst (Jul 2021 – Dec 2021) at Estate1
    • Used Google Analytics to track website traffic generated through social media channels, as well as user behaviour such as page views, time on site, and bounce rate, to optimise social media content and improve website performance.
    • Used SQL to extract data from the company's databases and social media platforms, allowing in-depth analysis of customer behaviour and social media campaign performance.
    • Utilised Python and Excel for advanced data analysis to predict customer behaviour and identify patterns in social media engagement.
    • Created reports and dashboards in Tableau that provided senior management with insights into social media campaign effectiveness, website performance, and other key business metrics.

Education

  • MS Data Science (Aug 2022 – present) from IBA Karachi
  • Bachelor in Business Administration (Sep 2018 – Jul 2022) from SZABIST, Karachi

Fee details

    Rs100,000–150,000/month (US$358.77–538.16/month)

    The fee can vary depending on the time period: a two-month package costs less per month, as does studying in a group of two (online).


Courses offered

  • DevOps & Cloud for Automation

    • US$500
    • Duration: 60 days (3 days/week)
    • Delivery mode: Flexible as per the student
    • Group size: 3
    • Instruction language: English
    • Certificate provided: No
    Course Learning Outcomes:
     
    o CLO1: Introduction to DevOps and Cloud Computing

    o Introduction to DevOps culture and its principles
    o Understanding the principles of DevOps and its relevance to data science
    o Exploring common challenges in data science projects and how DevOps can address them
    o Introduction to Agile methodologies for software development
    o Overview of cloud computing and cloud providers
    o Cloud computing (IaaS, PaaS, SaaS) and deployment models (public, private, hybrid)
     
    o CLO2: Practical Assignments for CLO1:

    o Primer on DevOps, Infrastructure and Cloud Computing, comparison of different cloud deployment and computing models.
    o Create an AWS free tier account and deploy a virtual machine using EC2.
    o Create an Azure free tier account and deploy a virtual machine using Azure VM.
     
    o CLO3: Infrastructure as Code

    o Introduction to infrastructure as code (IaC)
    o Infrastructure as code tools (Terraform, CloudFormation and Ansible)
    o Version control tools (Git)
    o What to include in Git and what not (managing the dilemma of secrets committed into Git)
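The "secrets committed into Git" dilemma above lends itself to a small demonstration. Below is a minimal, illustrative Python sketch of the kind of check that dedicated scanners automate before a commit; the regex patterns and sample strings are invented for illustration and are far from exhaustive.

```python
import re

# Illustrative patterns only; real secret scanners use much larger rule sets.
SECRET_PATTERNS = {
    "AWS access key": re.compile(r"AKIA[0-9A-Z]{16}"),
    "private key header": re.compile(r"-----BEGIN (?:RSA |EC )?PRIVATE KEY-----"),
    "generic api key": re.compile(r"(?i)api[_-]?key\s*[:=]\s*['\"][0-9a-z]{16,}['\"]"),
}

def scan_text(text):
    """Return a list of (label, match) pairs for likely secrets in text."""
    hits = []
    for label, pattern in SECRET_PATTERNS.items():
        for m in pattern.finditer(text):
            hits.append((label, m.group(0)))
    return hits

if __name__ == "__main__":
    # Invented sample content, standing in for a file staged for commit.
    sample = 'aws_key = "AKIAABCDEFGHIJKLMNOP"'
    for label, match in scan_text(sample):
        print(f"{label}: {match}")
```

A check like this would typically run as a pre-commit hook so a flagged file never reaches the repository history.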
     
    o CLO4: Practical Assignment for CLO3:

    o Develop an infrastructure as code (IaC) solution using Terraform to deploy an infrastructure on AWS (using free tier account)
    o Local deployment and development environment using Docker
    o Use Git and a Git repository to manage IaC code.
     
    o CLO5: Containerization

    o Introduction to containerization
    o Docker / Container Runtime
    o Kubernetes High Level Architecture
    o Deploying and managing containers on cloud-based infrastructure
    o Using Helm Charts and Static Manifests
    o Scaling and orchestration of containers
     
    o CLO6: Practical Assignment for CLO5

    o Develop and deploy a containerized application (a dummy application such as Hello World or a web server) by creating a Helm Chart and deploying it with Docker and Kubernetes in a local environment.
    o Use Kubernetes to manage and orchestrate containers on a cloud-based infrastructure and scale multiple replicas of your deployment.
     
    o CLO7: Continuous Integration and Deployment

    o Introduction to CI/CD pipeline
    o CI/CD tools (Jenkins, GitLab and GitHub Actions)
    o Integration of Automated testing (unit, integration, end-to-end)
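The automated-testing layer of a CI/CD pipeline can be illustrated with Python's built-in unittest module; the add function below is a hypothetical stand-in for application code, and in CI the same suite would run via `python -m unittest`, with the exit code signalling pass or fail to the pipeline.

```python
import unittest

def add(a, b):
    """Stand-in for application code under test."""
    return a + b

class TestAdd(unittest.TestCase):
    def test_integers(self):
        self.assertEqual(add(2, 3), 5)

    def test_strings_concatenate(self):
        self.assertEqual(add("a", "b"), "ab")

if __name__ == "__main__":
    # exit=False keeps the interpreter alive after the run; CI would use the
    # plain `python -m unittest` invocation instead.
    unittest.main(exit=False)
```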
     
    o CLO8: Practical Assignment for CLO7:

    o Develop and implement a CI/CD pipeline using Jenkins to automate the build, tests and deployment of an application.
    o Local development environment using Jenkins on Docker
    o Develop and implement a testing strategy for an application integrated in a CI/CD workflow using automated tests.
     
    o CLO9: Monitoring and Logging

    o Understanding the importance of monitoring and logging in DevOps practices
    o Introduction to monitoring tools like Nagios, Prometheus, and Grafana
    o Logging and aggregation tools like Elasticsearch, Logstash, and Kibana (ELK)
    o Benefits of using Elastic Stack
    o Using the open-source fork by Amazon (OpenSearch)
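The application side of the logging story above can be sketched with Python's standard logging module. In production the output would go to stdout, be picked up by a shipper, and land in Elasticsearch/OpenSearch or a similar store; here it goes to an in-memory stream purely so the result is easy to inspect, and the JSON-style format string is an invented example.

```python
import io
import logging

# In-memory sink for demonstration; a real service would log to stdout.
stream = io.StringIO()
handler = logging.StreamHandler(stream)
handler.setFormatter(logging.Formatter(
    '{"time": "%(asctime)s", "level": "%(levelname)s", "msg": "%(message)s"}'
))

logger = logging.getLogger("app")
logger.setLevel(logging.INFO)
logger.addHandler(handler)

logger.info("order created")
logger.warning("payment retry")

print(stream.getvalue())
```

Emitting one structured record per line like this is what makes downstream aggregation and querying in Kibana straightforward.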
     
    o CLO10: Practical Assignment for CLO9

    o Set up a monitoring and logging system for a deployed application: create an alert (email, Slack message or WhatsApp) and make logs available for the deployed workload using Kibana.
     
    o CLO11: Security and Compliance

    o Security best practices for DevOps
    o Students will learn how to integrate security practices into their DevOps workflows, including threat modeling, secure coding practices, and continuous security testing.
    o Securing cloud-based environments
    o Students will explore cloud security concepts and tools, including access control, network security, and identity and access management. They will learn how to apply security measures to cloud infrastructure, as well as how to monitor and audit security events in the cloud.
    o Securing data pipelines
    o Students will learn about data privacy and security regulations, as well as techniques for securing data pipelines and ensuring data integrity. They will also learn how to use encryption and other security measures to protect sensitive data.
     
    o CLO12: Quiz for CLO11

    o A quiz based on the lectures and provided sources will be conducted to facilitate discussion, communication and problem-solving, both as a group and individually.
     
    o CLO13: Design and Implement a Secure and Automated Data Pipeline (Cloud Ready)

    o The final project aims to assess the knowledge and skills acquired during the course by designing and implementing a secure automated data pipeline on the cloud using the DevOps practices learned throughout the course.
    o The project will require students to design and implement an end-to-end data pipeline (using Jenkins or another tool), starting from data ingestion, storage, processing, and analysis, to deployment and monitoring of the final product.
    o The project should be implemented ready to be deployed to a cloud platform such as Amazon Web Services (AWS), Microsoft Azure, or Google Cloud Platform (GCP).
    o A local Docker-based development environment will be used (creating a Docker image is a must).
    o Additional marks for running the whole infrastructure locally using Kubernetes.
    o The project will be assessed based on the quality of the design and implementation, the level of automation achieved, the effectiveness of the security measures, and the overall performance and usability of the final product.
  • Foundations of Data Science: From Novice to Pro

    • US$800
    • Duration: 90 days
    • Delivery mode: Flexible as per the student
    • Group size: 6 - 10
    • Instruction language: English
    • Certificate provided: No
    Week 1 - Introduction to Data Science
    Day 1: Course introduction, syllabus review, expectations, assessment method
    Day 2: What is data science, fields in data science, applications of data science
    Day 3: Role of a data scientist, required skills, ethics in data science
    Day 4: Data Science tools and libraries overview (Python, R, SQL, Excel)
    Day 5: Introduction to Python for Data Science: basic syntax, data types, variables

    Week 2 - Python for Data Science
    Day 1: Control structures: loops, conditionals, functions
    Day 2: Introduction to Libraries: NumPy and Pandas
    Day 3: Data Structures: Lists, Dictionaries, Sets, Tuples
    Day 4: File handling in Python, Reading and Writing files
    Day 5: Lab session: Exercises and practical implementation

    Week 3 - Data Wrangling with Python
    Day 1: Introduction to Pandas, DataFrames and Series
    Day 2: Importing data, cleaning data
    Day 3: Data manipulation: filter, sort, groupby, merge
    Day 4: Handling missing data, data formatting, data standardization
    Day 5: Lab session: Exercises and practical implementation
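The filter, group-by and aggregate operations covered this week can be sketched in plain Python so the semantics are clear before the lab introduces the Pandas equivalents; the listing records below are invented sample data.

```python
from collections import defaultdict

# Invented sample records, standing in for a loaded CSV.
rows = [
    {"city": "Karachi", "price": 120, "beds": 2},
    {"city": "Lahore",  "price": 90,  "beds": 1},
    {"city": "Karachi", "price": 200, "beds": 3},
    {"city": "Lahore",  "price": 110, "beds": 2},
]

# Filter: keep listings with at least 2 beds (Pandas: df[df["beds"] >= 2]).
filtered = [r for r in rows if r["beds"] >= 2]

# Group by city and average the price (Pandas: df.groupby("city")["price"].mean()).
prices_by_city = defaultdict(list)
for r in filtered:
    prices_by_city[r["city"]].append(r["price"])
mean_price = {city: sum(p) / len(p) for city, p in prices_by_city.items()}

print(mean_price)  # {'Karachi': 160.0, 'Lahore': 110.0}
```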

    Week 4 - Exploratory Data Analysis (EDA)
    Day 1: Introduction to EDA, descriptive statistics
    Day 2: Data visualisation with Matplotlib
    Day 3: Data visualisation with Seaborn
    Day 4: Correlation, Covariance, Outliers detection
    Day 5: Lab session: EDA on different datasets

    Week 5 - Introduction to SQL for Data Science
    Day 1: Introduction to SQL, DDL, DML
    Day 2: Basic SQL queries: SELECT, WHERE, ORDER BY
    Day 3: SQL JOINs, UNION, Aggregations (GROUP BY, HAVING)
    Day 4: SQL subqueries, functions
    Day 5: Lab session: SQL exercises

    Week 6 - Advanced SQL
    Day 1: Database design: normalisation, keys, index
    Day 2: Advanced SQL functions: analytical functions, stored procedures
    Day 3: Working with large datasets in SQL
    Day 4: Integrating SQL with Python
    Day 5: Lab session: Advanced SQL exercises
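Day 4's SQL-with-Python integration can be sketched with the standard library's sqlite3 module; the orders table and its rows are invented sample data, and the query mirrors the GROUP BY / HAVING material from Week 5.

```python
import sqlite3

# In-memory database for illustration; the course would use a real DBMS.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("alice", 30.0), ("bob", 20.0), ("alice", 50.0)],
)

# Total spend per customer, keeping only totals above 25.
rows = conn.execute(
    "SELECT customer, SUM(amount) FROM orders "
    "GROUP BY customer HAVING SUM(amount) > 25 ORDER BY customer"
).fetchall()
print(rows)  # [('alice', 80.0)]
conn.close()
```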

    Week 7 - Probability and Statistics for Data Science
    Day 1: Introduction to Probability: Basic concepts, probability rules
    Day 2: Probability Distributions: Binomial, Poisson, Normal distributions
    Day 3: Introduction to Statistics, descriptive vs inferential statistics
    Day 4: Statistical measures: mean, median, mode, variance, standard deviation
    Day 5: Hypothesis testing, p-value, confidence interval
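Day 5's hypothesis-testing material can be made concrete by computing a one-sample t statistic from scratch with only the standard library; the sample values and null mean are invented, and in practice the statistic would be compared against a t distribution to obtain a p-value.

```python
import math
import statistics

def one_sample_t(sample, mu0):
    """t statistic for H0: population mean == mu0."""
    n = len(sample)
    mean = statistics.fmean(sample)
    sd = statistics.stdev(sample)  # sample standard deviation (n - 1 denominator)
    return (mean - mu0) / (sd / math.sqrt(n))

# Invented measurements; H0 says the true mean is 5.0.
sample = [5.1, 4.9, 5.3, 5.2, 4.8, 5.0]
t = one_sample_t(sample, mu0=5.0)
print(round(t, 4))  # 0.6547
```

A small t value like this would not reject H0 at any conventional significance level.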

    Week 8 - Introduction to Machine Learning
    Day 1: What is Machine Learning, Types of Machine Learning
    Day 2: Introduction to Supervised Learning: Regression, Classification
    Day 3: Introduction to Unsupervised Learning: Clustering, Dimensionality Reduction
    Day 4: Introduction to Reinforcement Learning
    Day 5: Evaluating Machine Learning models: accuracy, precision, recall, F1-score, AUC-ROC
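The Day 5 evaluation metrics become concrete when computed from scratch for a binary problem; the label vectors below are invented.

```python
def classification_metrics(y_true, y_pred, positive=1):
    """Accuracy, precision, recall and F1 for binary labels, from first principles."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p == positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    accuracy = sum(1 for t, p in zip(y_true, y_pred) if t == p) / len(y_true)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return accuracy, precision, recall, f1

# Invented ground truth and predictions: 3 TP, 1 FP, 1 FN, 3 TN.
y_true = [1, 0, 1, 1, 0, 1, 0, 0]
y_pred = [1, 0, 1, 0, 0, 1, 1, 0]
acc, prec, rec, f1 = classification_metrics(y_true, y_pred)
print(acc, prec, rec, f1)  # 0.75 0.75 0.75 0.75
```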

    Week 9 - Regression Techniques in Machine Learning
    Day 1: Simple Linear Regression
    Day 2: Multiple Linear Regression
    Day 3: Polynomial Regression
    Day 4: Logistic Regression
    Day 5: Lab session: Regression exercises using sklearn
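Behind the library call the lab would use for simple linear regression sits a closed-form least-squares fit, which can be written out directly; the data points below are invented and lie exactly on y = 1 + 2x, so the recovered coefficients are easy to check.

```python
def fit_line(xs, ys):
    """Least-squares fit of y = a + b*x (the closed form for one feature)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Slope: covariance of x and y over variance of x.
    b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
    a = mean_y - b * mean_x  # intercept
    return a, b

xs = [1, 2, 3, 4, 5]
ys = [3, 5, 7, 9, 11]  # exactly y = 1 + 2x
a, b = fit_line(xs, ys)
print(a, b)  # 1.0 2.0
```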

    Week 10 - Classification Techniques in Machine Learning
    Day 1: K-Nearest Neighbors (KNN)
    Day 2: Support Vector Machine (SVM)
    Day 3: Decision Trees and Random Forest
    Day 4: Naive Bayes Classifier
    Day 5: Lab session: Classification exercises using sklearn
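K-Nearest Neighbors (Day 1) is simple enough to implement from scratch, which clarifies what the library classifier does under the hood: find the k closest training points and take a majority vote. The training points and labels below are invented.

```python
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """Classify query by majority vote among the k nearest training points."""
    dists = sorted((math.dist(x, query), label) for x, label in train)
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# Invented 2-D training data: two well-separated clusters.
train = [
    ((1.0, 1.0), "red"), ((1.2, 0.8), "red"), ((0.9, 1.1), "red"),
    ((5.0, 5.0), "blue"), ((5.2, 4.9), "blue"), ((4.8, 5.1), "blue"),
]
print(knn_predict(train, (1.1, 1.0)))  # red
print(knn_predict(train, (5.1, 5.0)))  # blue
```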

    Week 11 - Advanced Topics in Data Science
    Day 1: Introduction to Big Data: Hadoop, Spark
    Day 2: Introduction to Cloud Computing in Data Science: AWS, Google Cloud, Azure
    Day 3: Overview of Natural Language Processing (NLP)
    Day 4: Overview of Computer Vision, Convolutional Neural Networks
    Day 5: Introduction to Time Series Analysis, ARIMA

    Week 12 - Final Project Work
    Final Project: Applying the techniques learned throughout the course to solve a real-world data science problem.

Reviews

No reviews yet.