Data Analyst
Greenwood Village, CO, USA
Mark Sutfin
Castle Rock, Colorado 80104
mark.sutfin@gmail.com • www.linkedin.com/in/marksutfin
Skills (years)
Business Analyst (5)
Data Analyst (10)
Security Analyst (2)
Technical Writer (4)
HIPAA Officer (2)
QA / Application Testing (2)
Data Quality / Governance (2)
Python Scripting (3) - Python Web (<1)
Splunk Engineer (1+)
ETL (2)
SQL Developer (2)
Linux (5) | Windows (10) | Unix (2) | VM (6)
SDLC (10)
Digital Forensics (<1)
Certifications
• CEH | Security+ | Splunk Power User | Network+ | HIPAA Officer
Training
Ongoing
• Python for Data Science, June 6–21, 2018 (online / on-demand)
Meetups - weekly
• Agile – Plan the next Sprint
• Python
Community Volunteer
• EC-Council - CEH exam beta tester
• Pearson Publishing - Technical product reviewer / proofreader of industry and professional books
Personality / Temperament
• I take after Sherlock Holmes rather than Dr. Watson.
• I balance asking difficult questions with common sense and timing.
• I recognize when things are broken and I fix them.
• I own my work.
• I'm marketable - internally because of my integrity and externally because I bring respect to your brand.
Strengths
• Expert process flow analysis and documentation skills. (Ask me how I saved an employer $100K/yr.)
• Strong interpersonal skills - ability to interact with people at all levels of the organization
• Forensic-oriented
• Ability to work cross-functionally – adept with conflict resolution
• Advanced communication skills: present technical ideas to non-technical audiences and connect technical staff with business needs and user requirements
• Ability to handle multiple tasks simultaneously - proceed with minimal supervision
• Quickly learn new responsibilities, functions, systems and software
• Self-motivated, proactive, iterative, incremental and evolutionary
• Whatever, Wherever, Whenever attitude
Business Analysis / Technical Writing (samples available)
• Competent in requirements elicitation techniques (observation, surveys, brainstorming…)
• Translate business concepts to technical implementations
• Exceptional at discovering the ‘WHY’ beneath the ‘WHAT’ and helping users steer clear of ‘HOW’
• Experienced in determining the right areas for improvement (cost/resource-based analysis)
• Identify and document scope changes and perform impact analysis
• Advanced skills for reading, writing, and interpreting technical documentation
• Assist in or provide department/team and end-user training
• Work with end users to develop UI (screen mockups / wireframes)
• Develop and maintain all technical documentation (samples available upon request)
• Technical expertise in data models, database design, use cases, and user stories
• Scrum / Agile
Data Analysis
• Design and implement data platforms and processes to capture, integrate, analyze and distribute information in the enterprise
• Identify strategy and required tools; design and implement solutions for data ingestion, storage, processing, and provisioning on the data platform
• Legacy system conversion, data migration, database conversion and data mining
• Strong analytical skills to collect, organize, analyze, and disseminate significant amounts of information
• Troubleshoot system issues to identify potential system/process improvement
• Identify, analyze, correlate and interpret trends or patterns in complex data sets
• Experience using standard tools to acquire production data or manually create test data
• Data onboarding with inline data conversion and cleanup
Data/Process/Application Quality Assurance
• Hands-on QA and UAT testing of modifications/new systems with defect tracking/reporting
• Accomplished at identifying where data and program errors are introduced
• Tools/techniques for assessing and eliminating root causes of data failures
• Define data quality rules, policies and procedures
• Implement continuous oversight/auditing of data quality compliance
• Understand the importance of data accuracy and completeness; timeliness of access to the data; and consistency and uniqueness of data within and across data sets
ETL
• Familiar with ETL tools (Pentaho, Talend) and reporting packages (Tableau, Splunk)
• Oversee data extraction, transformation and load from a variety of data stores
• Assist in the design and implementation of an enterprise data warehouse
• Maintain detailed documentation for data warehouse and procedures
• Experienced in data integration, migration, data profiling, validation, verification and cleansing
• Perform logical to physical data mapping
SQL
• select, update, insert, delete
• sort, limiting output (is/is not null, ifnull, coalesce), and/or, distinct
• inner/outer joins, union
• temporary tables
• tables/columns/indexes – create, alter, drop, rename
• constraints – not null, unique, PK, FK, check, default
• index creation / auto increment
• date, aggregation and SQL functions
• hands-on development of procs, UDFs, triggers, views
• user accounts, connection parameters (interactive, batch), preference/option files, environment variables
• import/export/load external files/data
• vendor package install/upgrade/patch maintenance
• performance tuning, query plans, optimizer
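A minimal sketch of several of the items above (constraints, an outer join, COALESCE, aggregation), written against Python's built-in sqlite3 module; the schema and data are hypothetical.

import sqlite3

# In-memory database for demonstration only
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Constraints from the list above: NOT NULL, UNIQUE, PK, FK, CHECK
cur.executescript("""
CREATE TABLE dept (
    id   INTEGER PRIMARY KEY,
    name TEXT NOT NULL UNIQUE
);
CREATE TABLE emp (
    id      INTEGER PRIMARY KEY,
    name    TEXT NOT NULL,
    salary  REAL CHECK (salary > 0),
    dept_id INTEGER REFERENCES dept(id)
);
CREATE INDEX idx_emp_dept ON emp(dept_id);
""")

cur.executemany("INSERT INTO dept (id, name) VALUES (?, ?)",
                [(1, "Analytics"), (2, "Security")])
cur.executemany("INSERT INTO emp (name, salary, dept_id) VALUES (?, ?, ?)",
                [("Ada", 95000, 1), ("Grace", 105000, 1), ("Linus", 88000, None)])

# Outer join, NULL handling via COALESCE, and aggregation
for row in cur.execute("""
    SELECT COALESCE(d.name, 'Unassigned') AS dept,
           COUNT(e.id)             AS headcount,
           ROUND(AVG(e.salary), 2) AS avg_salary
    FROM emp e
    LEFT JOIN dept d ON d.id = e.dept_id
    GROUP BY dept
    ORDER BY dept
"""):
    print(row)

conn.close()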
Python
• build from source
• network and data analysis related code
• strings, numbers, variables, objects, type conversion, precedence
• lists, tuples, dictionaries and sets
• code structures (while, if, for), iterables, comprehensions
• functions (nested), closures, generators and decorators
• comparators, logical and conditional operators
• filehandles, file manipulation, file tests
• modules, packages and programs
• Python best practices
• regular expressions (advanced)
• PyCharm, REPL, Vim, JSON
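A short sketch touching several of the items above (a decorator, a generator, a comprehension, and a regular expression); the function names and sample data are illustrative only.

import re
import time
from functools import wraps

def timed(fn):
    # Decorator: report how long a call took
    @wraps(fn)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = fn(*args, **kwargs)
        print(f"{fn.__name__} took {time.perf_counter() - start:.4f}s")
        return result
    return wrapper

def grep(pattern, lines):
    # Generator: lazily yield lines matching a regex
    rx = re.compile(pattern)
    for line in lines:
        if rx.search(line):
            yield line

@timed
def word_lengths(text):
    # Set + dict comprehension: unique word -> length
    return {w: len(w) for w in set(text.split())}

print(list(grep(r"\berror\b", ["ok", "disk error", "error: timeout"])))
print(word_lengths("data quality governance data"))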
Python Web
• User management, including secure password handling, logins, user profiles and avatars
• Database management and database migration support
• Handling of user input via web forms
• Pagination of long lists of items
• Full-text search
• Email notifications to users
• HTML templates
• Working with dates and times
• Internationalization and localization
• Installation on a production server
• Working with Docker containers
• Application Programming Interfaces
• Push notifications
• Background jobs
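The framework behind the list above is not named; as a hypothetical illustration of handling user input via a web form, here is a minimal Flask sketch (the route, form field, and inline template are all placeholders).

from flask import Flask, request, render_template_string

app = Flask(__name__)

PAGE = """
<form method="post"><input name="q"><button>Search</button></form>
<p>{{ msg }}</p>
"""

@app.route("/", methods=["GET", "POST"])
def index():
    # Handle user input submitted via a web form
    msg = ""
    if request.method == "POST":
        msg = f"You searched for: {request.form.get('q', '')}"
    return render_template_string(PAGE, msg=msg)

if __name__ == "__main__":
    app.run(debug=True)  # development server only; use a WSGI server in production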
Cyber Security
• Familiar with the PICERL phases of incident response (Preparation, Identification, Containment, Eradication, Recovery, Lessons Learned)
• Splunk ES, Incident Review, Splunk 6.5.2
• DDoS mitigation with Arbor Peakflow, managed object configuration
• Comprehensive static data analysis (syslog and Windows event logs)
• Python scripting and forensics / advanced regular expressions (see the parsing sketch after the tool list below)
• Footprinting and Recon – Robtex, Google dorking, SmartWhois, traceroute, ping, nslookup
• Network Scanning – nmap, tcpdump
• Vulnerability Assessment – Metasploit
• Enumeration – nbtstat, dig
• Packet Analysis / Sniffing – Wireshark, tshark, ncat
• Additional Tools – Alert Logic, HPNA, whois, dig, Shodan, Qualys, Remedy, Samhain
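As a sketch of the static log analysis noted above (syslog plus advanced regular expressions), the following parses BSD-style syslog lines with named groups and counts failed SSH logins per source IP; the sample lines and field names are hypothetical.

import re
from collections import Counter

# Hypothetical BSD-style (RFC 3164-like) syslog samples
LINES = [
    "Jan 12 06:25:43 host1 sshd[4321]: Failed password for root from 203.0.113.9 port 51234 ssh2",
    "Jan 12 06:25:50 host1 sshd[4321]: Failed password for admin from 203.0.113.9 port 51301 ssh2",
    "Jan 12 06:26:02 host2 sshd[7777]: Accepted password for deploy from 198.51.100.4 port 22022 ssh2",
]

# Named groups make the field extraction self-documenting
SYSLOG_RX = re.compile(
    r"^(?P<ts>\w{3}\s+\d+\s[\d:]{8})\s(?P<host>\S+)\s"
    r"(?P<proc>\w+)\[(?P<pid>\d+)\]:\s(?P<msg>.*)$"
)

failures = Counter()
for line in LINES:
    m = SYSLOG_RX.match(line)
    if m and "Failed password" in m["msg"]:
        ip = re.search(r"from (\d+\.\d+\.\d+\.\d+)", m["msg"])
        if ip:
            failures[ip.group(1)] += 1

print(failures.most_common())  # [('203.0.113.9', 2)]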
Splunk
• Splunk ES for incident response
• Basics of the Splunk API and Splunk SDK (see the sketch below)
• Create searches, reports, dashboards, charts, alerts and apps based on business requirements
• Data acquisition / onboarding using Splunk Web
• Ability to write regex to perform field extractions at search time
• Experience writing Splunk queries to create Splunk dashboards / reports
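A minimal sketch of running a search through the Splunk SDK for Python, assuming the splunk-sdk package is installed; the host, credentials, and query below are placeholders.

import splunklib.client as client
import splunklib.results as results

# Connect to the Splunk management port (placeholder credentials)
service = client.connect(host="localhost", port=8089,
                         username="admin", password="changeme")

# Oneshot search: blocks until complete, then returns a results stream
stream = service.jobs.oneshot("search index=_internal error | head 5")
for item in results.ResultsReader(stream):
    if isinstance(item, dict):  # skip informational Message objects
        print(item.get("_raw"))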