๐๐๐'๐๐ ๐ต๐๐๐ ๐ฐ๐๐๐๐๐๐: ๐ฟ๐๐ ๐ฎ๐๐๐๐๐๐๐๐ ๐๐ ๐พ๐๐๐๐๐ ๐ฝ๐๐
Designed, Developed & Deployed With Precision
All Rights Reserved
I am a passionate Computer Science & Engineering student at IEM, Kolkata, with a keen interest in Artificial Intelligence and Machine Learning. Eager to explore the realms of AI and ML, I am committed to leveraging technology to drive innovation and solve complex problems.
💡 Areas of Interest 💡
✅ Artificial Intelligence
✅ Machine Learning
✅ Data Science
✅ Natural Language Processing
✅ Agentic AI / Gen AI
Achievements:
✅ Member of the Placement Department Community, IEM-UEM Group.
✅ Industry community member of ๐๐๐ GenAI ๐๐๐.
✅ Data Analyst Intern at ๐๐ฆ๐ค๐ฉ๐ด๐ฅ Innovations Consulting Pvt Limited, working on a project for Hindustan Copper Limited (Govt. of India).
✅ Member of Google Developer Group, IEM.
✅ Artificial Intelligence Intern at Pinnacle Labs.
✅ Machine Learning Intern at SkillCraft Technology.
✅ Member of Coding Club India.
✅ Campus Ambassador for GirlScript Summer of Code 2025.
✅ Machine Learning Intern at InternPe.
✅ Summer Intern at ๐๐๐ - ๐๐๐๐, GenAI ๐๐๐.
✅ Machine Learning Intern at Unified Mentor.
✅ Generative AI Intern at Prodigy InfoTech.
✅ Machine Learning Intern at CodeAlpha.
✅ Python Developer Intern at ๐๐๐ GenAI ๐๐๐.
✅ Machine Learning Intern at ALFIDO Tech.
✅ Junior Research Assistant at ๐๐๐ Lab, IEM.
✅ Attended a Study Abroad Program at the National University of Singapore (NUS).
✅ Earned certifications in AI/ML, Cybersecurity, and Data Science from Coursera, LinkedIn Learning, and Deloitte.
✅ Solved DSA problems on platforms like LeetCode and HackerRank.
✅ Completed internships in Artificial Intelligence, Machine Learning, and Python development.
✅ Completed a one-month internship at CodeAlpha on Machine Learning.
✅ Completed a one-month internship at Unified Mentor on Machine Learning.
✅ Completed a one-month internship at Intern Certify as a Python Developer.
✅ Completed a one-month internship at Prodigy InfoTech on Generative AI.
✅ Completed a one-month internship at ALFIDO Tech on Generative AI.
Description: A machine learning project for recognizing emotions from speech by analyzing vocal features like pitch, tone, and intensity to enhance human-computer interaction and emotional analysis.
Tools & Technologies:
Python Libraries: librosa, NumPy, scikit-learn, and matplotlib for audio processing, numerical computation, ML utilities, and plotting.
Deep Learning Framework: TensorFlow and its high-level Keras API for building and training the LSTM model.
KaggleHub: kagglehub is used to download the RAVDESS emotional speech audio dataset directly.
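The vocal features the description mentions (intensity, pitch content) can be approximated even without an audio library. Below is a minimal NumPy-only sketch of two classic frame-level features, RMS energy and zero-crossing rate, computed on a synthetic signal; in the actual project librosa's feature extractors play this role, and all names and values here are illustrative.

```python
import numpy as np

def frame_signal(signal, frame_len=2048, hop=512):
    # Split a 1-D signal into overlapping frames (as librosa does internally).
    n_frames = 1 + (len(signal) - frame_len) // hop
    return np.stack([signal[i * hop : i * hop + frame_len] for i in range(n_frames)])

def rms_energy(frames):
    # Root-mean-square energy per frame: a simple proxy for vocal intensity.
    return np.sqrt(np.mean(frames ** 2, axis=1))

def zero_crossing_rate(frames):
    # Fraction of sign changes per frame: correlates with noisiness and pitch content.
    signs = np.sign(frames)
    return np.mean(np.abs(np.diff(signs, axis=1)) > 0, axis=1)

# Synthetic 1-second "voice" at 22050 Hz: a 220 Hz tone plus a little noise.
sr = 22050
t = np.linspace(0, 1, sr, endpoint=False)
signal = np.sin(2 * np.pi * 220 * t) + 0.05 * np.random.default_rng(0).normal(size=sr)

frames = frame_signal(signal)
features = np.column_stack([rms_energy(frames), zero_crossing_rate(frames)])
print(features.shape)  # one (RMS, ZCR) pair per frame: (40, 2)
```

In the real pipeline these per-frame features (or MFCCs) are stacked into sequences and fed to the LSTM.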
Description: A machine learning model that predicts diseases from patient data for early detection, better outcomes, and optimized healthcare resources.
Tools & Technologies:
Python Libraries: pandas, scikit-learn, matplotlib, and seaborn for data handling, preprocessing, and visualization.
TensorFlow/Keras: For building and training the neural network model.
StandardScaler and LabelEncoder: For feature scaling and target encoding.
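As a small illustration of the preprocessing step listed above, here is how StandardScaler and LabelEncoder are typically combined; the patient values below are made up for the example.

```python
import numpy as np
from sklearn.preprocessing import LabelEncoder, StandardScaler

# Toy patient records (invented values): [age, resting_bp, cholesterol]
X = np.array([[45, 120, 200],
              [60, 140, 260],
              [38, 110, 180],
              [52, 135, 240]], dtype=float)
y = ["healthy", "disease", "healthy", "disease"]

# StandardScaler: zero-mean, unit-variance features so no column dominates training.
X_scaled = StandardScaler().fit_transform(X)

# LabelEncoder: string targets -> integer class ids for the model.
encoder = LabelEncoder()
y_encoded = encoder.fit_transform(y)

print(X_scaled.mean(axis=0).round(6))  # ~[0, 0, 0]
print(list(encoder.classes_))          # ['disease', 'healthy'] (sorted alphabetically)
```

Note that LabelEncoder assigns ids in alphabetical order, so "disease" becomes 0 and "healthy" becomes 1 here.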
Description: A machine learning model that predicts breast cancer using medical data for early detection and improved treatment outcomes.
Tools & Technologies:
Python Libraries: pandas, scikit-learn, matplotlib, and seaborn for data handling, preprocessing, and visualization.
TensorFlow/Keras: For building and training the neural network model.
StandardScaler and LabelEncoder: For feature scaling and target encoding.
Description: A machine learning model that predicts diabetes using patient data for early detection, improved treatment planning, and better healthcare outcomes.
Tools & Technologies:
Python Libraries: Used libraries like pandas, scikit-learn, matplotlib, and seaborn for data loading, preprocessing, and visualization.
Scikit-learn Models: Applied Logistic Regression, Random Forest, and other classifiers for prediction.
StandardScaler and LabelEncoder: For feature scaling and encoding categorical data.
Jupyter Notebook/Google Colab: Used as the development environment for writing and testing code.
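The train/evaluate loop such a model follows can be sketched compactly with a Random Forest on synthetic two-feature "patient" data; the feature names and cluster values below are invented purely for illustration.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Synthetic data: two well-separated clusters standing in for
# non-diabetic (label 0) and diabetic (label 1) cases.
rng = np.random.default_rng(42)
X0 = rng.normal(loc=[100, 25], scale=5, size=(100, 2))  # e.g. [glucose, BMI]
X1 = rng.normal(loc=[160, 35], scale=5, size=(100, 2))
X = np.vstack([X0, X1])
y = np.array([0] * 100 + [1] * 100)

# Hold out a test split, fit, and score, mirroring the project's workflow.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)
model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
acc = accuracy_score(y_test, model.predict(X_test))
print(acc)
```

On real patient data the clusters overlap far more, which is where scaling, feature selection, and model comparison start to matter.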
Description: A sleek and interactive web application that provides real-time weather data, air quality index, and hourly forecasts using city input or your current location. Designed for usability and smart forecasting.
Tools & Technologies:
HTML, CSS, JavaScript: Built a responsive and visually appealing frontend with animated transitions and interactive features.
OpenWeatherMap API: Used to fetch live weather, AQI, and forecast data.
Dynamic UI Features: Includes sun/moon switching, background changes based on weather, and animated splash screen.
Responsive Design: Optimized for both desktop and mobile views with media queries and adaptive layouts.
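The app itself is JavaScript, but the API calls are easy to sketch; here is a Python version of how the OpenWeatherMap request URLs are typically assembled (endpoints as documented by OpenWeatherMap; the API key is a placeholder).

```python
from urllib.parse import urlencode

BASE = "https://api.openweathermap.org/data/2.5"

def weather_url(city, api_key, units="metric"):
    # Current-weather endpoint, queried by city name.
    return f"{BASE}/weather?" + urlencode({"q": city, "appid": api_key, "units": units})

def air_quality_url(lat, lon, api_key):
    # Air-pollution (AQI) endpoint, queried by coordinates.
    return f"{BASE}/air_pollution?" + urlencode({"lat": lat, "lon": lon, "appid": api_key})

url = weather_url("Kolkata", "YOUR_API_KEY")
print(url)
```

The frontend does the equivalent with `fetch(url).then(r => r.json())` and then renders the returned JSON into the UI.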
Description: This project predicts passenger survival on the Titanic using ML models like Logistic Regression and Random Forest. Data preprocessing, feature engineering, and model evaluation are done using Python and scikit-learn.
Tools & Technologies:
Python Libraries: Used libraries like pandas, numpy, matplotlib, and seaborn for data loading, analysis, and visualization.
Scikit-learn Models: Applied Logistic Regression, Random Forest, and Decision Tree classifiers to predict passenger survival.
StandardScaler and LabelEncoder: Used for scaling numerical features and encoding categorical data such as gender and embarkation port.
Jupyter Notebook/Google Colab: Used as the development environment for writing, testing, and visualizing model performance.
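A minimal sketch of the preprocessing described above, run on a made-up four-row sample shaped like the Titanic data (imputation, then numeric encoding of gender and embarkation port):

```python
import pandas as pd

# Tiny invented sample in the shape of the Titanic dataset.
df = pd.DataFrame({
    "Sex": ["male", "female", "female", "male"],
    "Age": [22.0, None, 26.0, 35.0],
    "Embarked": ["S", "C", None, "S"],
})

# Impute missing values: median age, most frequent port.
df["Age"] = df["Age"].fillna(df["Age"].median())
df["Embarked"] = df["Embarked"].fillna(df["Embarked"].mode()[0])

# Encode categoricals as integers for the classifiers.
df["Sex"] = df["Sex"].map({"male": 0, "female": 1})
df["Embarked"] = df["Embarked"].map({"S": 0, "C": 1, "Q": 2})

print(df)
```

After this step the frame is fully numeric and can be passed to StandardScaler and the scikit-learn models directly.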
Description: A machine learning model to recognize handwritten characters (A-Z, 0-9) using image processing and neural networks for text digitization and analysis.
Tools & Technologies:
TensorFlow/Keras: For building and training the CNN model for image classification.
Python Libraries: NumPy, Pandas, Matplotlib, and OpenCV for data manipulation, visualization, and image processing.
MNIST and a custom A-Z handwritten letter dataset: For training and evaluation.
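At the heart of the CNN is the 2-D convolution. The NumPy-only sketch below runs a single "valid" convolution over a toy stroke image, a stand-in for what each Keras Conv2D layer computes (the kernel and image are invented for the example).

```python
import numpy as np

def conv2d(image, kernel):
    # "Valid" 2-D convolution (really cross-correlation, as in most DL frameworks).
    kh, kw = kernel.shape
    h = image.shape[0] - kh + 1
    w = image.shape[1] - kw + 1
    out = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# A 6x6 "image" with a vertical stroke, like part of a handwritten '1'.
img = np.zeros((6, 6))
img[:, 2] = 1.0

# Sobel-style kernel that responds strongly to vertical edges.
kernel = np.array([[1, 0, -1],
                   [2, 0, -2],
                   [1, 0, -1]], dtype=float)

fmap = conv2d(img, kernel)
print(fmap.shape)  # (4, 4): the feature map lights up along the stroke's edges
```

A trained CNN stacks many such learned kernels, with pooling and dense layers on top to map feature maps to the 36 character classes.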
Description: This project builds a machine learning credit-scoring model that predicts loan-repayment likelihood, helping financial institutions make better lending decisions and minimize risk.
Tools & Technologies:
Python Libraries: pandas, NumPy, and scikit-learn for data handling, preprocessing, and machine learning.
Random Forest Classifier: An ensemble learning method used for classification.
Evaluation Metrics: Accuracy, the classification report, and the confusion matrix to assess model performance.
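The evaluation metrics named above can be shown on a tiny set of hypothetical repayment labels and predictions (values invented for the example):

```python
import numpy as np
from sklearn.metrics import accuracy_score, confusion_matrix, classification_report

# Hypothetical repayment labels (1 = repaid) vs. a model's predictions.
y_true = np.array([1, 0, 1, 1, 0, 1, 0, 0, 1, 0])
y_pred = np.array([1, 0, 1, 0, 0, 1, 1, 0, 1, 0])

cm = confusion_matrix(y_true, y_pred)  # rows: true class, cols: predicted class
acc = accuracy_score(y_true, y_pred)
print(cm)   # [[4 1]
            #  [1 4]]
print(acc)  # 0.8
print(classification_report(y_true, y_pred, target_names=["default", "repaid"]))
```

In credit scoring the off-diagonal cells matter most: a false "repaid" (top-right) is a risky loan approved, while a false "default" (bottom-left) is a good customer rejected.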
Description: This project focuses on developing a Smart City Multi-Agent Simulation using agentic AI frameworks. The goal is to simulate urban scenarios like traffic control, energy management, emergency response, and healthcare using intelligent agents that coordinate and make autonomous decisions to optimize city operations.
Tools & Technologies:
Python Libraries: pandas, NumPy, and matplotlib for data handling and visualization.
Agentic AI Frameworks: Used AutoGen and CrewAI to design, build, and coordinate multiple specialized agents.
Technologies: Applied Generative AI, Machine Learning, and NLP techniques to enable agents to understand, learn, and respond dynamically.
Simulation & Evaluation: Tested multi-agent interactions in smart city scenarios and evaluated efficiency, coordination, and real-time responsiveness.
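The propose-then-coordinate pattern behind such a simulation can be sketched without any framework. The toy loop below is not AutoGen or CrewAI code; it only illustrates the coordination idea, with agent names, priorities, and actions invented for the example.

```python
# Each agent observes shared city state and proposes an action;
# a coordinator resolves conflicts by priority.

class TrafficAgent:
    name, priority = "traffic", 1
    def propose(self, state):
        if state["congestion"] > 0.7:
            return "extend_green_phase"
        return None

class EmergencyAgent:
    name, priority = "emergency", 10  # emergencies override routine actions
    def propose(self, state):
        if state["ambulance_en_route"]:
            return "clear_corridor"
        return None

def coordinate(agents, state):
    # Collect non-empty proposals and let the highest-priority agent win.
    proposals = [(a.priority, a.name, a.propose(state)) for a in agents]
    proposals = [p for p in proposals if p[2] is not None]
    if not proposals:
        return ("none", "idle")
    _, name, action = max(proposals)
    return (name, action)

state = {"congestion": 0.9, "ambulance_en_route": True}
print(coordinate([TrafficAgent(), EmergencyAgent()], state))  # ('emergency', 'clear_corridor')
```

Frameworks like AutoGen and CrewAI replace the hand-written `propose` methods with LLM-backed agents and the `coordinate` function with richer conversation and tool-use protocols.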
Description: This project uses ML models like Logistic Regression and Decision Tree to classify iris flowers into three species (Setosa, Versicolor, Virginica) based on sepal and petal measurements. Implemented in Python using scikit-learn.
Tools & Technologies:
Python Libraries: Used libraries like pandas, NumPy, matplotlib, seaborn, and scikit-learn for data handling, visualization, and machine learning.
Classification Models: Applied models like Logistic Regression, Decision Tree, and Random Forest for predicting iris species.
Evaluation Metrics: Used accuracy, classification report, and confusion matrix to measure model performance.
Jupyter Notebook/Google Colab: Used for writing, testing, and visualizing the model in an interactive environment.
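Since the Iris dataset ships with scikit-learn, the whole pipeline fits in a few lines. This sketch uses a single Decision Tree; the project also compares Logistic Regression and Random Forest in the same fit/score pattern.

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

# Load the bundled dataset: 150 flowers, 4 measurements, 3 species.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42, stratify=y)

model = DecisionTreeClassifier(random_state=42).fit(X_train, y_train)
acc = accuracy_score(y_test, model.predict(X_test))
print(f"accuracy: {acc:.3f}")
```

Iris is nearly linearly separable, so most classifiers score well above 90% here; the interesting part is comparing their confusion matrices on the Versicolor/Virginica boundary.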
Description: This project uses the XGBoost regression model to predict house prices based on features like area, location, number of rooms, etc. Built with Python, Pandas, and Scikit-learn.
Tools & Technologies:
Python Libraries: Used libraries like pandas, NumPy, matplotlib, and scikit-learn for data preprocessing, visualization, and model building.
XGBoost Regressor: Applied this powerful gradient boosting algorithm for accurate house price prediction.
Evaluation Metrics: Used metrics like RMSE (Root Mean Squared Error) and Rยฒ Score to evaluate model performance.
Jupyter Notebook/Google Colab: Used as the coding environment for implementation and testing.
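XGBoost follows the same fit/predict interface as scikit-learn's gradient boosting, so the workflow can be sketched with sklearn's GradientBoostingRegressor as a stand-in (no xgboost package required). The housing features and coefficients below are invented for illustration.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error, r2_score

# Synthetic housing data: price driven by area, rooms, and distance to centre.
rng = np.random.default_rng(7)
n = 500
area = rng.uniform(50, 250, n)    # square metres
rooms = rng.integers(1, 6, n)
dist = rng.uniform(1, 30, n)      # km to city centre
price = 3000 * area + 20000 * rooms - 4000 * dist + rng.normal(0, 20000, n)

X = np.column_stack([area, rooms, dist])
X_train, X_test, y_train, y_test = train_test_split(X, price, random_state=0)

model = GradientBoostingRegressor(random_state=0).fit(X_train, y_train)
pred = model.predict(X_test)
rmse = np.sqrt(mean_squared_error(y_test, pred))  # RMSE, as in the project
r2 = r2_score(y_test, pred)
print(f"RMSE: {rmse:,.0f}  R^2: {r2:.3f}")
```

Swapping in `xgboost.XGBRegressor` requires only changing the model line, which is what makes the two libraries easy to benchmark against each other.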
Description: This script implements a Pix2Pix Generative Adversarial Network (GAN) for image-to-image translation on the "facades" dataset, generating architectural facades from input images.
Tools & Technologies:
Python Libraries: os, pathlib, time, datetime, matplotlib, and IPython.display for file handling, time tracking, plotting, and displaying outputs.
TensorFlow and Keras: The deep learning framework used for building and training the Generator and Discriminator models.
Dataset: The "facades" dataset, downloaded via tf.keras.utils.get_file, provides the training and testing images for the image-translation task.
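The generator's training objective can be stated compactly: an adversarial loss plus a weighted L1 reconstruction loss (λ = 100 in the TensorFlow Pix2Pix tutorial this script appears to follow). Below is a NumPy sketch of that combined loss on toy values; it mirrors the structure of the objective, not the actual TensorFlow implementation.

```python
import numpy as np

LAMBDA = 100  # weight on the L1 term, as in the TensorFlow Pix2Pix tutorial

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def generator_loss(disc_logits_on_fake, generated, target):
    # Adversarial part: the generator wants the discriminator to say "real" (1)
    # on its fakes -> binary cross-entropy against an all-ones label map.
    p = sigmoid(disc_logits_on_fake)
    gan_loss = -np.mean(np.log(p + 1e-9))
    # Reconstruction part: L1 distance keeps the output close to the target facade.
    l1_loss = np.mean(np.abs(target - generated))
    return gan_loss + LAMBDA * l1_loss

# Toy 4x4 single-channel "images".
rng = np.random.default_rng(0)
target = rng.uniform(-1, 1, (4, 4))
generated = target + 0.1   # close to the target -> small L1 term
logits = np.zeros((4, 4))  # discriminator is unsure -> p = 0.5 everywhere

loss = generator_loss(logits, generated, target)
print(round(loss, 3))  # 10.693
```

The large λ is why Pix2Pix outputs stay structurally faithful to the input: the L1 term dominates, while the adversarial term sharpens texture.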
Description: This is a Markov chain text generator. It learns word sequences from training text and then generates new text from the learned transition probabilities.
Tools & Technologies:
Python: The primary programming language in which the Markov chain text generator is implemented.
collections.defaultdict: A dictionary-like structure from Python's collections module used to store the Markov chain model; each key (a sequence of words) automatically gets an empty list as its value if the key is not yet present.
random: A built-in Python module that introduces randomness into generation, specifically when selecting the next word from the candidates observed to follow the current state.
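The description above maps directly to a few lines of code. Here is a self-contained sketch of such a generator; the identifiers are illustrative, not necessarily those of the original script.

```python
import random
from collections import defaultdict

def build_chain(text, order=1):
    # Map each state (tuple of `order` words) to the list of words that follow it.
    words = text.split()
    chain = defaultdict(list)
    for i in range(len(words) - order):
        state = tuple(words[i:i + order])
        chain[state].append(words[i + order])
    return chain

def generate(chain, length=10, seed=None):
    rng = random.Random(seed)
    state = rng.choice(list(chain))  # random starting state
    out = list(state)
    for _ in range(length - len(state)):
        options = chain.get(state)
        if not options:              # dead end: no observed continuation
            break
        nxt = rng.choice(options)    # duplicates in the list encode frequency
        out.append(nxt)
        state = state[1:] + (nxt,)
    return " ".join(out)

text = "the cat sat on the mat and the cat ran off"
chain = build_chain(text)
print(chain[("the",)])  # ['cat', 'mat', 'cat']
print(generate(chain, length=8, seed=42))
```

Because followers are stored as a plain list (with repeats), `random.choice` already samples them in proportion to how often each continuation appeared in the training text.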