Data Analyst


Location: NJ | Mobile: ------------ | Email: D------------
Summary

• 4+ years of IT experience as a Data Analyst in business intelligence, software development, data-driven applications, statistical analysis, and predictive modeling to support business decisions.
• Experience in all phases of the SDLC, including Requirement Analysis, Implementation, and Maintenance, with good experience in both Agile and Waterfall methodologies.
• Experience in developing statistical, text analytics, and data mining solutions to various business problems, generating data visualizations using Python and Tableau, and designing and developing data warehouses, data marts, and reporting for Business Intelligence.
• Proficient in developing different types of Tableau reports and dashboards, adapting the data model to fit Tableau's needs, and exploring data in Tableau to best answer business questions.
• Good knowledge of OLTP/OLAP systems, E-R modeling, database development, and dimensional and multidimensional modeling.
• Worked with several Python packages, including NumPy, pandas, PyTables, Seaborn, Matplotlib, and scikit-learn.
• Experience working with various Python Integrated Development Environments such as PyCharm, Anaconda, and Eclipse.
• Data analysis: data collection, transformation, and loading using different ETL systems such as SSIS.
• Experienced in creating filters, quick filters, data source filters, global filters, context filters, user filters, and actions, and in creating dashboards for key performance indicators (KPIs).
• Good experience in text analytics, generating data visualizations using Python, and creating dashboards using tools like Tableau (a brief illustrative sketch follows this list).
• Experience in developing and analyzing data models, and in writing simple and complex SQL queries to extract data from databases for data analysis and testing.
• Good understanding of building Artificial Neural Network models using the TensorFlow package in Python.
• Hands-on experience with stored procedures, functions, and triggers, and strong experience writing complex queries in SQL Server and MySQL.
• Good knowledge of issue-tracking systems such as JIRA and version control tools such as Git.
• Understanding of data warehousing principles such as fact tables, dimension tables, and dimensional data modeling (Star Schema and Snowflake Schema).
• Good analytical, problem-solving, communication, and interpersonal skills, with the ability to interact with individuals at all levels.
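
A minimal sketch of the pandas/Matplotlib analysis-and-visualization workflow referenced above, not actual project code; the file name (sales.csv) and columns (region, revenue) are hypothetical placeholders:

    # Minimal sketch, assuming a hypothetical sales.csv with 'region' and 'revenue' columns.
    import pandas as pd
    import matplotlib.pyplot as plt

    df = pd.read_csv("sales.csv")                      # load raw data
    df = df.dropna(subset=["revenue"])                 # basic cleaning: drop rows missing the measure
    summary = df.groupby("region")["revenue"].sum()    # aggregate by a business dimension

    summary.plot(kind="bar", title="Revenue by region")
    plt.tight_layout()
    plt.savefig("revenue_by_region.png")               # export a chart for reporting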



Education

Master of Science in Computer and Information Science
New York Institute of Technology, Old Westbury, NY

Bachelor of Engineering in Computer Engineering
Gujarat Technological University, India




Certificate
PH526x: Using Python for Research [CERTIFICATE ID: dc------------b9096ca4baa498fb84d]



Technical Skills

Methodologies: Agile, Waterfall, SDLC
Reporting Tools: Tableau, Tableau Server
ETL Tools: SSAS, SSIS
Databases: MS SQL Server, MySQL, MS Access
Tools: MS Project, JIRA, Git
IDE: PyCharm, Anaconda, Eclipse
Languages: Python, HTML5, CSS3, JavaScript, SQL
Packages: NumPy, Pandas, PyTables, Seaborn, Matplotlib, Scikit-learn
Operating System: Windows, Linux
Documentation Tools: MS Office, Microsoft Visio, KPI

Experience

Liberty Mutual Insurance, NY | Aug 2018 - Current
Role: Data Analyst
Responsibilities:

• Worked in an Agile environment, with the ability to accommodate and test newly proposed changes at any point during the release.
• Applied various algorithms and statistical models such as decision trees, regression models, neural networks, SVM, and clustering to identify volume, using the scikit-learn package in Python (see the sketch after this list).
• Optimized data sources for the route distribution analytics dashboard in Tableau, reducing report runtime.
• Used pandas, NumPy, SciPy, Matplotlib, and scikit-learn in Python to develop various machine learning algorithms.
• Developed Data Mapping, Data Governance, Transformation, and Cleansing rules for the Master Data Management architecture involving OLTP and OLAP.
• Imported customer data into Python using the pandas library and performed data analysis, finding patterns in the data that helped drive key decisions for the company.
• Created customized SQL queries using MySQL Management Studio to pull specified data for analysis and report building in conjunction with Crystal Reports.
• Utilized Tableau Server to publish and share reports with business users.
• Managed departmental reporting systems, troubleshooting daily issues and integrating existing Access databases with numerous external data sources (SQL, Excel, and Access).
• Worked extensively with MS Visio and Rational tools to document requirements.
• Wrote several Teradata SQL queries using SQL Assistant for ad hoc data pull requests.
• Used JIRA for issue tracking and Git for version control in project management.
• Defined project milestones and schedules, monitored progress, and updated plans using MS Project.
• Extracted data from the legacy system and loaded/integrated it into another database through an ETL process.
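
An illustrative sketch of the scikit-learn modeling step described above, not the actual project code; the input file (claims.csv), feature columns, and target (claim_count) are assumed placeholders:

    # Hedged sketch: assumed feature/target names, not the real project dataset.
    import pandas as pd
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeRegressor
    from sklearn.metrics import mean_absolute_error

    data = pd.read_csv("claims.csv")                     # hypothetical input extract
    X = data[["policy_age", "premium", "region_code"]]   # assumed numeric features
    y = data["claim_count"]                              # assumed volume target

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
    model = DecisionTreeRegressor(max_depth=5, random_state=42)
    model.fit(X_train, y_train)
    print("MAE:", mean_absolute_error(y_test, model.predict(X_test)))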









KPIT Technologies, India | May 2015 – July 2017
Role: Data Analyst
Responsibilities:

• Waterfall methodology was used throughout the project, with daily scrum meetings and bi-weekly sprint planning and backlog meetings.
• Imported customer data into Python using the pandas library and performed various data analyses, finding patterns in the data that supported key decisions for the company.
• Performed exploratory data analysis on the aggregated data using Python libraries to find missing data and outliers.
• Built an Artificial Neural Network using TensorFlow in Python to predict a customer's probability of cancelling their connection (see the sketch after this list).
• Performed data collection, data cleaning, validation, and visualization, and developed strategic uses of data.
• Used different features of Tableau to create filters and interactivity based on user requirements; created action filters, parameters, and calculated sets for preparing dashboards and worksheets in Tableau.
• Generated various graphical capacity planning reports using Python packages such as NumPy, Matplotlib, and SciPy.
• Developed data migration and cleansing rules for the integration architecture.
• Used SSIS to create ETL packages to validate, extract, transform, and load data into data warehouse databases, and processed SSAS cubes to store data in OLAP databases.
• Created UML diagrams including context, business rules flow, and class diagrams.
• Managed worksheets using manual and automated processes on the Tableau development server.
• Utilized Tableau and the custom SQL feature to create dashboards and identify correlations.
• Involved in data mapping activities for the data warehouse.
• Worked in MS SQL Server Management Studio, creating databases and writing scripts for analysis and filtering.
• Hands-on experience with pivot tables and graphs in MS Excel and MS Access.
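
A hedged, illustrative sketch of the kind of TensorFlow network mentioned above for estimating cancellation probability; the synthetic feature matrix and labels below are placeholders, not the project's data or architecture:

    # Illustrative only: a small Keras network producing a churn-style probability.
    # The random X and y are stand-ins for preprocessed customer features and labels.
    import numpy as np
    import tensorflow as tf

    X = np.random.rand(1000, 10).astype("float32")    # placeholder customer features
    y = np.random.randint(0, 2, size=(1000, 1))       # placeholder 0/1 cancellation labels

    model = tf.keras.Sequential([
        tf.keras.layers.Dense(16, activation="relu", input_shape=(10,)),
        tf.keras.layers.Dense(8, activation="relu"),
        tf.keras.layers.Dense(1, activation="sigmoid"),   # outputs a cancellation probability
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    model.fit(X, y, epochs=5, batch_size=32, verbose=0)
    cancel_probability = model.predict(X[:5])             # probabilities for a few customers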