Naeem Shahzad

I am a Principal Backend Developer with extensive experience building and maintaining efficient, scalable, and reliable data pipelines and backend systems using Python, SQL, NoSQL, ETL, data visualization, and cloud computing. My expertise spans technologies such as Scrapy, Selenium, BeautifulSoup, Pandas, Spark, AWS, GCP, Elasticsearch, Django, and Docker. I work well independently, deliver high-quality results, and am always eager to take on new challenges and continuously improve my skills.

Experience

Principal Backend Developer

xiQ, Inc.
  • Drive the development of new features and functionalities, working closely with product managers, designers, and frontend developers to translate requirements into technical solutions.
  • Provide mentorship and guidance to junior developers, helping them grow their technical skills and advance their careers.
April 2024 - Present

Senior Backend Developer

xiQ, Inc.
  • Designed and developed Gilroy, a chatbot copywriter, resulting in an 80% increase in user engagement on the platform.
  • Developed innovative features such as user group sharing and user certification, contributing to increased platform usability and credibility.
  • Identified and resolved bugs throughout the development process.
  • Managed cloud services and resources, ensuring systems ran smoothly in the cloud.
  • Collaborated closely with frontend developers to ensure seamless integration between the frontend and backend components of the application.
  • Optimized performance bottlenecks, resulting in a substantial improvement in system efficiency and responsiveness.
July 2022 - March 2024

Data Engineer

Datalytix.ai
  • Constructed a data pipeline to process semi-structured data, integrating 400 million raw records from 20+ sources using Elasticsearch and Python.
  • Integrated data from multiple third-party systems for social media monitoring, customer engagement, and voice analytics.
  • Developed REST APIs with Django and Gunicorn, created Docker images, and deployed them on serverless infrastructure.
  • Built web scraping tools for optimal data extraction from social media sites using Selenium and BeautifulSoup.
  • Integrated machine learning models with AI platforms such as Vertex AI for enhanced data analysis.
  • Designed batch and real-time data processing pipelines for machine learning models to make predictions on raw data.
  • Developed a distributed, scalable, unsupervised web crawler using Scrapy, Redis, and MySQL, which aggregated millions of news articles weekly.
  • Optimized data workflows to ensure data quality and reliability.
  • Led a team of one data engineer and two data scientists, and collaborated with project managers and analysts to deliver key initiatives.
September 2018 - July 2022

Software Developer

PeoplePerHour
  • Developed Selenium bots using Python to automate repetitive tasks, increasing efficiency and reducing manual effort.
  • Built tools to manage product stock for a drop-shipping business, ensuring accurate inventory tracking.
  • Created tools to detect and remove snipe bids in eBay live auctions, improving auction reliability and fairness.
  • Provided web scraping and web crawling services for social media and web platforms to gather valuable data insights.
January 2017 - September 2018

Software Developer

Fiverr
  • Developed automation bots using Python and Selenium to handle repetitive and mundane tasks, improving workflow efficiency.
  • Built lead generation tools for multiple industries, reducing manual work by 95% and accelerating customer acquisition processes.
  • Extracted data from social networking sites for keyword monitoring to support digital marketing strategies.
  • Created price-monitoring tools for various marketplaces to identify and secure the best deals available.
  • Parsed data from HTML, XML, and PDF formats and saved output to Excel, Word, PostgreSQL, and NoSQL databases for further analysis.
  • Extracted product data from wholesaler website files and formatted it for upload to WooCommerce stores, streamlining e-commerce catalog management.
  • Deployed scripts on AWS, Google Cloud Platform, and DigitalOcean VMs for scalable and efficient application hosting.
  • Developed ETL and automated data pipelines using Spark to process and transform large data sets.
  • Performed basic data visualization using Tableau and PostgreSQL to support data-driven decision-making.
  • Developed APIs with Django and integrated with third-party APIs to extend platform capabilities.
  • Conducted remote maintenance and troubleshooting to ensure seamless system operation and reduce downtime.
February 2016 - September 2018

Education

Information Technology University (ITU), Lahore

Bachelor of Science
Computer Science
September 2014 - June 2018

Skills

Programming Languages & Tools
Workflow
  • End-to-End Development
  • Version Control with Git
  • Cross Functional Teams
  • Cloud Deployment
  • Automation & Scripting
  • Agile Development & Scrum

Interests

Beyond developing innovative software solutions, I am passionate about exploring the latest advancements in artificial intelligence and machine learning, particularly in the context of automation and data-driven applications. I enjoy experimenting with new technologies, building side projects, and contributing to open source initiatives.

In my free time, I love traveling, hiking, and exploring nature, which helps me recharge and stay active. I also enjoy reading about technology trends.