Jan Voldán

Founder

Prague, Czech Republic · 11 yrs 1 mo experience
AI Enabled · AI ML Practitioner

Key Highlights

  • Founder of Vibecodiq, enhancing AI app production readiness.
  • Expert in data quality management across multiple industries.
  • Innovator in AI-safe architecture for scalable applications.

Skills

Core Skills

Software Architecture · Artificial Intelligence (AI) · Data Warehousing · Data Quality Management · Data Integration · Business Intelligence

Other Skills

AI codebase audits · recovery strategy · AI-safe architecture boundaries · production-readiness · risk assessment · AI architecture · architecture design · AI code structuring · Informatica · Teradata · Java · Python · Informatica Big Data Management · Jenkins pipelines · SQL

About

I help technical founders ship AI-generated SaaS apps without production disasters.

If you're building with Cursor, Lovable, Bolt, or Replit and you've ever:

  • felt afraid to deploy because the codebase became unpredictable
  • watched one change break three unrelated features
  • discovered auth, billing, or admin holes after a customer found them
  • spent two weeks debugging what AI shipped in two days

you're not alone. This is usually not a developer problem. It's a structural problem.

I'm building Vibecodiq — a production-readiness scanner for AI-generated SaaS apps. It helps founders find auth, billing, admin, security, and architecture risks before they become customer problems.

I'm also building ASA (AI-Safe Architecture), an open standard for structuring AI-generated codebases so they stay predictable, maintainable, and shippable as they grow.

What I write about here:

  • architecture drift in AI-generated codebases
  • production-readiness for technical founders
  • auth, billing, and admin failure patterns
  • why AI apps slow down after month 3
  • why structural enforcement beats "more QA"
  • the hidden cost of vibe coding past MVP

If you're a founder, CTO, or indie hacker building SaaS with AI tools, follow along. I write in English for a global audience, mainly US / UK / EU.

If you want to understand how your AI-built app actually behaves in production: → https://vibecodiq.com
If you're interested in the underlying architecture principles: → https://asastandard.org
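Vibecodiq's actual checks are not public, so purely as an illustration of what a production-readiness rule for auth gaps might look like, here is a toy scanner that flags Express-style routes whose declaration carries no known auth middleware (the route patterns, marker names, and sample code are all invented):

```python
import re

# Matches e.g. app.get('/path', ...) and captures the HTTP method and path.
ROUTE_RE = re.compile(r"app\.(get|post|put|delete)\(\s*['\"]([^'\"]+)")

def unguarded_routes(source, auth_markers=("requireAuth", "isAuthenticated")):
    """Flag routes whose declaration line references no known auth marker."""
    findings = []
    for line in source.splitlines():
        m = ROUTE_RE.search(line)
        if m and not any(marker in line for marker in auth_markers):
            findings.append((m.group(1).upper(), m.group(2)))
    return findings

# Invented sample codebase fragment:
code = """
app.get('/public/health', handler)
app.post('/admin/users', requireAuth, adminHandler)
app.delete('/admin/users/:id', adminHandler)
"""
print(unguarded_routes(code))
```

A real scanner would resolve middleware chains across files rather than matching single lines; this sketch only shows the shape of a structural rule.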

Experience

11 yrs 1 mo
Total Experience
1 yr 6 mos
Average Tenure
--
Current Experience

Vibecodiq

Founder @ Vibecodiq | Creator of ASA (AI Safe Architecture)

Oct 2025 - Present · 7 mos · Worldwide · Remote

  • I focus on solving an emerging challenge in modern software: unstable or difficult-to-maintain AI-generated applications.
  • With 20+ years in data engineering, data quality, and enterprise architecture, I've repeatedly seen the same pattern: without strict structure, systems eventually become fragile and hard to evolve.
  • My work focuses on AI codebase audits, recovery strategy, and designing AI-safe architecture boundaries to help teams transition from experimental AI coding to controlled AI engineering.
  • This work led to the development of ASA (AI Safe Architecture) — a deterministic architecture approach designed to make AI-driven systems safe, auditable, and maintainable by design.
AI codebase audits · recovery strategy · AI-safe architecture boundaries · Software Architecture · Artificial Intelligence (AI)

ČSOB

Data Specialist / DWH Consultant

Oct 2018 - Present · 7 yrs 7 mos · Prague, Czechia

  • Building a Data Quality Management System from scratch.
  • Working within the large Rainbow project, preparing a DWH-based solution for company management across the KBC Group in 6 countries.
  • Tools used:
  • Informatica Big Data Management
  • Informatica Analyst Tool
  • UAC scheduling Tool
  • Jenkins pipelines
  • Teradata UDI Studio
  • Coding in Java, Python, Groovy
  • Intellij IDEA
  • Teradata SQL Assistant
  • Teradata Viewpoint and performance tuning
  • BTEQ - BTEQWin 16.20
  • Teradata Parallel Transporter
  • Oracle SQL Developer
  • Qlik Sense Data Analytics Platform
  • DBeaver 6.0.1
  • WinSCP
  • Putty
Informatica · Teradata · Java · Python · Data Warehousing · Data Quality Management

MSD Global Innovation Center

Informatica Integration Platform Analyst

Nov 2016 - Apr 2018 · 1 yr 5 mos · Prague, Czechia

Roche

Informatica Consultant - Data Quality programme

Jul 2016 - Nov 2016 · 4 mos · London (WGC), England, United Kingdom

  • Data quality for a budget forecasting project: rewriting SQL rules into logical objects and IDQ rules; creating profiles and scorecards; building data validation and data standardization processes using reference data; working with both the Developer and Analyst tools; importing IDQ rules as mapplets into PowerCenter.
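The validation and standardization work described above was done with proprietary IDQ rules, but the underlying logic can be sketched in plain Python. This is a minimal, invented example (the reference data, aliases, and column are hypothetical): standardize raw values against reference data, then score the column for a scorecard.

```python
# Illustrative reference data and standardization aliases (invented).
REFERENCE_COUNTRIES = {"GB": "United Kingdom", "CZ": "Czech Republic"}
ALIASES = {"uk": "GB", "great britain": "GB", "czech rep.": "CZ"}

def standardize_country(raw):
    """Map a raw value to a canonical reference code, or None if impossible."""
    key = raw.strip().lower()
    if key.upper() in REFERENCE_COUNTRIES:
        return key.upper()
    return ALIASES.get(key)

def scorecard(values):
    """Score a column: share of values valid against the reference data."""
    standardized = [standardize_country(v) for v in values]
    valid = sum(1 for s in standardized if s is not None)
    return {"total": len(values), "valid": valid,
            "validity_pct": round(100 * valid / len(values), 1)}

print(scorecard(["UK", "cz", "Czech Rep.", "Atlantis"]))
```

In IDQ terms, the standardization function plays the role of a reusable rule/mapplet and the scorecard aggregates its results per column.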

Royal Bank of Scotland Business

Informatica Consultant - Data Quality programme

May 2015 - Apr 2016 · 11 mos · Edinburgh, Scotland, United Kingdom

  • Project background: worked on a large Informatica Data Quality programme, specifically on the Data Profiling Service (DPS) and the Data Quality Measurement System (DQMS).
  • DPS is the single point of contact for data profiling requests and enables centralized data profiling, allowing consistency and efficiency as data assets are measured, assessed, and rationalized through effective data management. It assists the bank in analyzing and measuring data quality: an initial assessment followed by periodic measuring gives insight into data issues so that corrective and preventive action can be taken.
  • As part of the overall RBS Chief Data Office Data Management Programme, the DQMS calculates and displays Data Quality KPI scores, based on the results of data quality rules applied to the Group Key Data Elements and fed into the DQMS from DPS. Developed using Agile methodology.
  • Informatica MDM Multidomain Edition (ver 9.7) Experiences:
  • Installation and configuration of Hub Console, Hub Store, Hub Server, Cleanse Match Server and Cleanse Adapter in Windows
  • Design and configuration of landing tables, staging tables, base objects, hierarchies, foreign-key relationships, lookups, query groups, queries and packages
  • Worked on data cleansing and standardization using the cleanse functions in Informatica MDM
  • Creating Mappings, custom functions, Trust and Validation rules, Match Path, Match Column, Match rules, Merge properties to get the right master records
  • Creating and Execute Batch jobs and Batch Groups
  • Tools and Environments:
  • Informatica IDQ Developer 9.6.1 HotFix 2
  • Informatica PowerCenter 9.6.1 HotFix 2
  • Informatica PowerExchange 9.1.0
  • PowerCenter Web Service Hub 9.1.0
  • Red Hat Enterprise Linux Server 5
  • Oracle Database 11g Enterprise Edition Release 11.2.0.4.0
  • PL/SQL Release 11.2.0.4.0
  • Toad for Oracle Base 11
  • HP Server Automation Client
  • Quality Center 10.0
Informatica · SQL · Data Quality · Data Quality Management · Data Integration
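The DQMS described for this role derives Data Quality KPI scores from rule results on Key Data Elements. As a hedged sketch of that aggregation step only (the element names, rule names, and counts are invented; the real system was built in Informatica, not Python):

```python
from collections import defaultdict

# Each record: (key_data_element, rule_name, rows_tested, rows_passed).
# All values below are invented sample data.
rule_results = [
    ("customer_id", "not_null", 1000, 1000),
    ("customer_id", "unique", 1000, 990),
    ("date_of_birth", "valid_date", 1000, 950),
]

def kpi_scores(results):
    """Aggregate rule pass rates into one KPI score per key data element."""
    tested = defaultdict(int)
    passed = defaultdict(int)
    for element, _rule, n_tested, n_passed in results:
        tested[element] += n_tested
        passed[element] += n_passed
    return {e: round(100 * passed[e] / tested[e], 2) for e in tested}

print(kpi_scores(rule_results))
```

A production DQMS would also weight rules and track scores over time; this only shows the pass-rate rollup at the core of such a KPI.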

ING Insurance / Investment Management

2 roles

Informatica Architect - PowerCenter Data Integration & Data Migration

Sep 2012 - Apr 2015 · 2 yrs 7 mos

  • ING Insurance has to comply with the requirements of the Solvency II program, so systems under regional responsibility for Solvency II reporting were implemented. The reporting data are provided in a timely (quarterly scheduled), documented, auditable and robust manner. The ETL functional design used for reporting can be divided into logical steps: statistics and metadata of ETL processes, waiting for prepared source data, integrating data from multiple source systems into the stage area, transformation of data, validation of data, historization, copying output data into target systems, and archiving and clean-up. These requirements were fulfilled by an automated ETL system implemented with Informatica PowerCenter 9.5.1, Windows PowerShell scripting and Oracle Database 11g.
Informatica MDM · Data Quality · ETL · Data Quality Management · Data Integration
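The logical ETL step sequence from that functional design can be sketched as an ordered pipeline. This is illustrative only: the step names mirror the description above, but the implementations are placeholders (the real system ran in Informatica PowerCenter, not Python).

```python
def run_pipeline(steps):
    """Run the steps in order, collecting run statistics/metadata in ctx."""
    ctx = {"log": []}
    for name, step in steps:
        step(ctx)               # a real step would read/write the stage area
        ctx["log"].append(name) # statistics and metadata of ETL processes
    return ctx

noop = lambda ctx: None         # placeholder for real step logic
PIPELINE = [
    ("wait_for_source_data", noop),
    ("integrate_into_stage", noop),
    ("transform", noop),
    ("validate", noop),
    ("historize", noop),
    ("copy_to_targets", noop),
    ("archive_and_cleanup", noop),
]

print(run_pipeline(PIPELINE)["log"])
```

Encoding the order explicitly is what makes such a run auditable: each quarterly load leaves a log of exactly which steps executed, in what sequence.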

Informatica Architect - Data Quality

Sep 2012 - Apr 2015 · 2 yrs 7 mos

  • The need to comply with Solvency II and to improve data management efficiency across the whole enterprise drove my engagement at ING. I identified the main data transformations of reported data items and mapped the supporting processes for asset, risk and liabilities reporting. From the top-down perspective, the Key Data Items were identified along with their respective definitions and data quality rules, and compiled into data directories. Mapping and analysis of internal processes in ING was done in cooperation with data owners. Visual diagrams describing data flows in the company and a data directory describing each data handover point were created. The data are profiled in the Informatica Developer/Analyst Tool and data quality results are presented to the customer/stakeholders. This process ensures data quality monitoring of Key Data Items on a regular basis. Technologies used: Informatica Data Quality 9.5.1, Oracle Database 11g.
Informatica PowerCenter · ETL · Oracle · Data Integration · Data Quality Management
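The profiling behind that monitoring was done in the Informatica Developer/Analyst Tool, but the basic per-column profile it produces can be sketched in a few lines of Python (the sample values are invented; real profiles also include pattern and value-frequency analysis):

```python
def profile_column(values):
    """Basic column profile: completeness, distinct count, min/max length."""
    non_null = [v for v in values if v not in (None, "")]
    lengths = [len(str(v)) for v in non_null]
    return {
        "rows": len(values),
        "completeness_pct": round(100 * len(non_null) / len(values), 1),
        "distinct": len(set(non_null)),
        "min_len": min(lengths) if lengths else 0,
        "max_len": max(lengths) if lengths else 0,
    }

print(profile_column(["CZ", "GB", None, "CZ", ""]))
```

Running such a profile on each Key Data Item at a fixed cadence, and comparing against thresholds from the data quality rules, is what turns one-off profiling into regular monitoring.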

Telefónica Czech Republic

Informatica PowerCenter - Data Integration Consultant

Sep 2011 - Sep 2012 · 1 yr · Prague, Czechia

  • Telefónica O2 uses Informatica not only for the data warehouse but also as an integration platform for batch processing of data. With the gradual expansion of Informatica as a universal integration tool, the need to transfer data in real time through this platform increased. The Informatica Real-Time Edition extension was installed for this purpose and the required real-time data transmission services were implemented, mainly: transmission and display of billing data, generation and transmission of data for reporting (Customer Care, Ultra Broadband, ...), order tracking, merging of real-time databases, and others. Technologies used: Informatica 9.1.0 with RTE, Informatica 8.6.1 with RTE (WebService deployment, WebService calling, Java transformation), Perl, bash scripting, Solaris.
Informatica Data Quality · Oracle · Data Quality Management · Data Integration

GE Money Bank, Czech Republic

DWH/BI Consultant

Jun 2009 - Aug 2011 · 2 yrs 2 mos · Prague, Czechia

  • Allocated for various challenging projects in GE Money Bank:
  • Business requirements analysis, BI solution analysis and design
  • Technical design
  • Leader of small implementation group
  • Building the Data Warehouse solution (Oracle), ETL implementation & optimization, SQL tuning
  • Some of my projects in GE Money Bank:
  • In scope of the Variable Mortgage project, the existing data warehouse and related ETL processes had to be extended because of the new Variable Mortgage product. I did the analysis and implementation for the DataHUB and MIS systems. The purpose of the Verification2NAS project was to improve the verification processes for loan applicants.
  • The goal of the FX Loan project was to enable the manual creation and administration of commercial EUR and USD loans in ICBS and all affected systems, allowing the commercial department to offer these products to clients. The CX-Analytics project integrated data stored in many different data stores; I was responsible for implementing the new DataHUB structures needed for data transfer from source systems, and for data exports from the MIS system.
  • Because of the merger of GE Money Bank and GE Money Multiservis, there was a request to report OK/CEL products and credit cards into the corporate Actimize system. I was responsible for the analysis, implementation of the ETL reporting process, and integration tests.
  • Informatica PowerCenter 8.6
  • Informatica PowerExchange 8.6
Informatica · Java · Perl · Data Integration

GEM System a.s.

Oracle PL/SQL, ETL Developer

Oct 2006 - Jun 2009 · 2 yrs 8 mos

  • Analysis, design and implementation of data warehouse & business intelligence solutions for GEM System a.s. (designing data models, building and optimizing ETL processes, creating business reports). Technologies used: Oracle Database, Oracle Data Integrator, Enterprise Architect.
Oracle · ETL · SQL · Data Warehousing · Business Intelligence

Education

Faculty of Electrical Engineering, Czech Technical University in Prague

Bachelor's degree — Computer Software Engineering

Jan 2003 - Jan 2006

The Secondary School of Electrical Engineering, Prague, 2002

Secondary school-leaving examination — Electronic computer systems

Jan 1998 - Jan 2002
