Analytics and Data Warehouse Manager
Optimized data pipelines, reducing processing failures by 90% and improving monitoring mechanisms to ensure 99.9% uptime for reporting and analytics.
Enhanced ETL workflows, achieving an 80% reduction in processing time and cutting infrastructure costs by 35% while maintaining data integrity.
Combined PL/SQL and Python, boosting query execution speeds by 70% and decreasing overall computational costs by 50%, improving performance for large-scale data processing.
About Analytics and Data Warehouse Manager
Highly skilled Data Analyst with a strong foundation in Python for data manipulation and automation, SQL for efficient database management and optimization, and Power BI for insightful data visualization and reporting. Proven ability to translate complex datasets into actionable insights, driving informed decision-making through comprehensive analysis and compelling visual representations. Eager to leverage these skills to contribute to a data-driven organization.
Skills
Python
Extensively utilized Python for data analysis, cleaning, and processing tasks, and leveraged it to automate data manipulation workflows and generate detailed reports, increasing efficiency and accuracy across projects.
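As an illustrative sketch of this kind of workflow (the dataset, column names, and functions here are hypothetical), a typical cleaning-and-report step in pandas might look like:

```python
import pandas as pd

# Hypothetical raw records with the usual problems: duplicates,
# inconsistent casing, and missing values.
raw = pd.DataFrame({
    "customer": ["Acme", "acme", "Beta Corp", None],
    "amount": [100.0, 100.0, 250.0, 75.0],
})

def clean_sales(df: pd.DataFrame) -> pd.DataFrame:
    """Drop rows missing a customer, normalize names, deduplicate."""
    out = df.dropna(subset=["customer"]).copy()
    out["customer"] = out["customer"].str.strip().str.title()
    return out.drop_duplicates()

def summary_report(df: pd.DataFrame) -> pd.DataFrame:
    """Aggregate cleaned data into a per-customer report."""
    return df.groupby("customer", as_index=False)["amount"].sum()

report = summary_report(clean_sales(raw))
print(report)
```

The same two functions can then be scheduled or reused across datasets, which is where the automation gains come from.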
SQL
Utilized SQL for data cleaning and processing tasks, including creating and managing stored procedures to streamline ETL processes. Managed and optimized key objects such as tables, views, and procedures, which are essential to the design and maintenance of the data warehouse I oversee; these objects also play a critical role in supporting reporting tools like Excel and Power BI.
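A minimal self-contained sketch of this staging-table-to-reporting-view pattern, using SQLite (which lacks stored procedures, so the procedure logic is inlined; table and column names are invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Staging table holding raw, possibly messy source rows.
cur.execute("CREATE TABLE stg_orders (id INTEGER, region TEXT, total REAL)")
cur.executemany(
    "INSERT INTO stg_orders VALUES (?, ?, ?)",
    [(1, " north ", 120.0), (2, "south", None), (3, "north", 80.0)],
)

# A reporting view that cleans and aggregates -- the role a
# warehouse view plays for tools like Excel or Power BI.
cur.execute("""
    CREATE VIEW rpt_region_totals AS
    SELECT TRIM(region) AS region, SUM(total) AS total
    FROM stg_orders
    WHERE total IS NOT NULL
    GROUP BY TRIM(region)
""")

rows = cur.execute(
    "SELECT region, total FROM rpt_region_totals ORDER BY region"
).fetchall()
print(rows)  # [('north', 200.0)]
```

In a production warehouse the cleaning step would live in a stored procedure and the view would sit on permanent tables, but the separation of raw staging data from a cleaned reporting layer is the same.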
Power BI
Create and manage Power BI reports tailored to user requirements, and developed value propositions that enable better understanding and analysis of company data. My responsibilities also include overseeing visual reports, OLAP connections, semantic models, and Power BI Web environment administration.
Airflow
Used Airflow to orchestrate and schedule various data processing workflows, ensuring seamless connection and synchronization between different environments and databases.
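A real implementation would use Airflow's `DAG` and operator classes; purely to illustrate the dependency ordering such a DAG encodes (task names are invented), the extract-transform-load pattern can be sketched with the standard library:

```python
from graphlib import TopologicalSorter  # Python 3.9+

# Hypothetical task dependencies, in the spirit of an Airflow DAG:
# each key runs only after all of its listed upstream tasks.
deps = {
    "extract_source_a": set(),
    "extract_source_b": set(),
    "transform": {"extract_source_a", "extract_source_b"},
    "load_warehouse": {"transform"},
}

# A valid execution order: every task appears after its upstreams.
order = list(TopologicalSorter(deps).static_order())
print(order)
```

Airflow adds scheduling, retries, and cross-environment connections on top of exactly this kind of dependency graph.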
AWS S3
Leverage Amazon S3 as a core component of our data architecture, harnessing its virtually unlimited storage capacity to manage large volumes of raw source data from diverse streams. By utilizing S3’s scalable design, I ensure our data is stored with high durability and availability.
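One common way to keep raw streams like these manageable in S3 is date-partitioned object keys; this helper is an illustrative sketch (the bucket layout is invented, and the actual upload would use boto3's `put_object`):

```python
from datetime import date

def raw_object_key(stream: str, day: date, filename: str) -> str:
    """Build a Hive-style partitioned key so raw data from each
    source stream can be listed and pruned by date."""
    return (
        f"raw/{stream}/year={day.year}/month={day.month:02d}/"
        f"day={day.day:02d}/{filename}"
    )

key = raw_object_key("crm_events", date(2024, 3, 7), "part-0001.json")
print(key)  # raw/crm_events/year=2024/month=03/day=07/part-0001.json
```

Keys structured this way let downstream query engines and lifecycle rules target a single stream or date range without scanning the whole bucket.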
AWS RDS
Leverage AWS RDS to manage and maintain PostgreSQL database instances, fully capitalizing on its scalability and high-availability features. By automating critical tasks such as backups, patch management, and updates, I uphold exceptional standards of data integrity and security. I also proactively monitor performance using Amazon CloudWatch and other specialized tools to swiftly identify and address bottlenecks, optimizing overall database performance. This comprehensive approach lets me deliver robust solutions tailored to the demanding needs of production, development, and testing environments.
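Comparing returned metric datapoints against a threshold is one small piece of that monitoring; this sketch operates on a CloudWatch-style list of datapoints (the threshold and values are hypothetical, and fetching real ones would go through boto3's `get_metric_statistics`):

```python
def cpu_alarms(datapoints, threshold=80.0):
    """Return timestamps whose average CPU exceeds the threshold,
    mimicking a check over RDS CPUUtilization datapoints."""
    return [
        dp["Timestamp"]
        for dp in sorted(datapoints, key=lambda dp: dp["Timestamp"])
        if dp["Average"] > threshold
    ]

# Hypothetical datapoints shaped like a CloudWatch response.
points = [
    {"Timestamp": "2024-03-07T10:00Z", "Average": 62.5},
    {"Timestamp": "2024-03-07T10:05Z", "Average": 91.0},
    {"Timestamp": "2024-03-07T10:10Z", "Average": 78.9},
]
print(cpu_alarms(points))  # ['2024-03-07T10:05Z']
```

In practice CloudWatch alarms handle this server-side; a script like this is useful for ad-hoc analysis of exported metrics across environments.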