Sprint 2: MVP v2
“Would it save you a lot of time if I just gave up and went mad now?”
― Douglas Adams, The Hitchhiker’s Guide to the Galaxy
Marketing and Sales Strategy
The marketing strategy for Rizu focuses on introducing the platform to organizations seeking a simplified approach to private cloud management through OpenStack. The target audience consists mainly of small and medium-sized enterprises, educational institutions, and IT departments that need accessible cloud tools without the complexity of traditional interfaces. This audience has been selected because of its increasing interest in adopting cloud-based infrastructure and its demand for solutions that reduce training and operational barriers.
Promotion will rely on digital channels aligned with professional and technical user habits, including campaigns on LinkedIn and participation in online communities dedicated to cloud technologies. Demonstration workshops and technical webinars will be used to showcase Rizu’s features and build trust through direct interaction. Strategic collaborations with local technology providers will help extend market reach and create referral networks.
Sales will follow a business-to-business model supported by direct outreach and an online trial system to facilitate product adoption. The financial plan includes projected costs for digital marketing, event participation, and content production, ensuring consistency between the promotional activities and the overall financial structure.
Finance
Pre-operations Budget
The following table presents the overall pre-operations budget for the execution of the project.
This budget covers all necessary expenses for the development, testing, and initial deployment of the OpenStack alternative dashboard, including personnel remuneration, required hardware and software, and dissemination of results.
Currency format
All values are expressed in Colombian pesos (COP).
Items | Own (Our Team) | Other Sources | Total |
---|---|---|---|
Personnel | $4,000,000.00 | $2,000,000.00 (University Support Program) | $6,000,000.00 |
Equipment | $3,500,000.00 | $1,000,000.00 (Faculty Lab Access) | $4,500,000.00 |
Software | $1,200,000.00 | $0.00 | $1,200,000.00 |
Materials | $600,000.00 | $0.00 | $600,000.00 |
Field Visits | $800,000.00 | $400,000.00 (OpenInfra Meetup Travel Fund) | $1,200,000.00 |
Bibliographic Material | $300,000.00 | $0.00 | $300,000.00 |
Publications, Patents, or Software Registration | $500,000.00 | $500,000.00 (Institutional Research Fund) | $1,000,000.00 |
Technical Services | $700,000.00 | $0.00 | $700,000.00 |
Travel | $400,000.00 | $200,000.00 (Collaboration Grant) | $600,000.00 |
Total | $12,000,000.00 | $4,100,000.00 | $16,100,000.00 |
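As a quick consistency check, the row and column totals of the budget can be verified programmatically. A minimal sketch, with figures transcribed from the table above:

```python
# Pre-operations budget (COP), transcribed from the table above.
# Each entry: (own contribution, other sources).
budget = {
    "Personnel": (4_000_000, 2_000_000),
    "Equipment": (3_500_000, 1_000_000),
    "Software": (1_200_000, 0),
    "Materials": (600_000, 0),
    "Field Visits": (800_000, 400_000),
    "Bibliographic Material": (300_000, 0),
    "Publications, Patents, or Software Registration": (500_000, 500_000),
    "Technical Services": (700_000, 0),
    "Travel": (400_000, 200_000),
}

own_total = sum(own for own, _ in budget.values())
other_total = sum(other for _, other in budget.values())
grand_total = own_total + other_total

print(own_total)    # 12000000 (COP, own funding)
print(other_total)  # 4100000 (COP, other sources)
print(grand_total)  # 16100000 (COP, total)
```

The computed totals match the table, confirming the column sums are internally consistent.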
Operations Budget
With the pre-operations budget established, the next step is to define the operational budget by estimating both fixed and variable costs. The following tables present the estimated monthly fixed costs and the per-client variable costs of maintaining the project.
Fixed Costs

Category | Monthly Expense (COP) |
---|---|
Developer Stipends (4 members) | $3,200,000.00 |
Server Maintenance (University Infrastructure) | $500,000.00 |
Software Subscriptions (GitHub, IDEs, APIs) | $300,000.00 |
Domain and Hosting Services | $150,000.00 |
Electricity and Connectivity | $250,000.00 |
Administrative and Communication Expenses | $200,000.00 |
Total Fixed Costs | $4,600,000.00 |
Variable Costs

Category | Cost per Client or Deployment (COP) |
---|---|
Cloud Hosting (per client deployment) | $120,000.00 |
Data Transfer and API Usage Fees | $60,000.00 |
Technical Support and Maintenance | $80,000.00 |
Testing and CI/CD Pipeline Resources | $40,000.00 |
Payment Processing and Transaction Fees | $20,000.00 |
Total Estimated Variable Cost (per client) | $320,000.00 |
The operational budget reflects a modest academic-scale setup that relies on institutional infrastructure and open-source tools, keeping operational costs low even if the project evolves to serve external users.
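Taken together, the two tables imply a simple linear cost model: a fixed monthly base plus a per-client increment. A minimal sketch using the figures above (the function and constant names are ours, for illustration only):

```python
FIXED_COSTS_COP = 4_600_000          # total monthly fixed costs, from the table above
VARIABLE_COST_PER_CLIENT = 320_000   # estimated variable cost per client deployment

def monthly_operating_cost(clients: int) -> int:
    """Estimated total monthly cost (COP) for a given number of active clients."""
    return FIXED_COSTS_COP + VARIABLE_COST_PER_CLIENT * clients

print(monthly_operating_cost(0))   # 4600000 -- fixed base with no clients
print(monthly_operating_cost(10))  # 7800000 -- fixed base plus ten deployments
```

Note that the yearly projections below hold expenses nearly flat as clients grow, implicitly assuming that institutional infrastructure absorbs most per-client costs.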
Operational Projections
This section presents the projected income and expenses for the first three years of operation, with each year broken down month by month. Estimates assume gradual adoption by users through university research groups, academic institutions, and potential industry partnerships; projected income corresponds to Premium Dashboard subscriptions at $180,000.00 COP per client per month.
Products and Services Offered
Product or Service | Estimated Price per Client or Deployment (COP) |
---|---|
Premium Dashboard Subscription (per client/month) | $180,000.00 |
Custom Module Development (per contract) | $800,000.00 |
Technical Support and Maintenance Package (monthly) | $250,000.00 |
API Integration Service (per deployment) | $400,000.00 |
Training and Implementation Workshop (per session) | $600,000.00 |
Financial Projection
Year 1
Month | Estimated Clients or Sales | Income | Expenses |
---|---|---|---|
Month 1 | 2 | $360,000.00 | $4,920,000.00 |
Month 2 | 3 | $540,000.00 | $4,920,000.00 |
Month 3 | 4 | $720,000.00 | $4,920,000.00 |
Month 4 | 5 | $900,000.00 | $4,920,000.00 |
Month 5 | 6 | $1,080,000.00 | $4,920,000.00 |
Month 6 | 7 | $1,260,000.00 | $4,920,000.00 |
Month 7 | 9 | $1,620,000.00 | $5,000,000.00 |
Month 8 | 11 | $1,980,000.00 | $5,000,000.00 |
Month 9 | 13 | $2,340,000.00 | $5,000,000.00 |
Month 10 | 15 | $2,700,000.00 | $5,000,000.00 |
Month 11 | 18 | $3,240,000.00 | $5,000,000.00 |
Month 12 | 20 | $3,600,000.00 | $5,000,000.00 |
Total Year 1 | 113 | $20,340,000.00 | $59,520,000.00 |
Year 2
Month | Estimated Clients or Sales | Income | Expenses |
---|---|---|---|
Month 13 | 22 | $3,960,000.00 | $5,500,000.00 |
Month 14 | 24 | $4,320,000.00 | $5,500,000.00 |
Month 15 | 26 | $4,680,000.00 | $5,500,000.00 |
Month 16 | 28 | $5,040,000.00 | $5,500,000.00 |
Month 17 | 30 | $5,400,000.00 | $5,600,000.00 |
Month 18 | 32 | $5,760,000.00 | $5,600,000.00 |
Month 19 | 34 | $6,120,000.00 | $5,700,000.00 |
Month 20 | 36 | $6,480,000.00 | $5,700,000.00 |
Month 21 | 38 | $6,840,000.00 | $5,800,000.00 |
Month 22 | 40 | $7,200,000.00 | $5,800,000.00 |
Month 23 | 43 | $7,740,000.00 | $5,900,000.00 |
Month 24 | 46 | $8,280,000.00 | $6,000,000.00 |
Total Year 2 | 399 | $71,820,000.00 | $68,100,000.00 |
Year 3
Month | Estimated Clients or Sales | Income | Expenses |
---|---|---|---|
Month 25 | 48 | $8,640,000.00 | $6,200,000.00 |
Month 26 | 50 | $9,000,000.00 | $6,200,000.00 |
Month 27 | 52 | $9,360,000.00 | $6,200,000.00 |
Month 28 | 54 | $9,720,000.00 | $6,200,000.00 |
Month 29 | 56 | $10,080,000.00 | $6,200,000.00 |
Month 30 | 58 | $10,440,000.00 | $6,300,000.00 |
Month 31 | 60 | $10,800,000.00 | $6,300,000.00 |
Month 32 | 62 | $11,160,000.00 | $6,400,000.00 |
Month 33 | 64 | $11,520,000.00 | $6,400,000.00 |
Month 34 | 66 | $11,880,000.00 | $6,400,000.00 |
Month 35 | 68 | $12,240,000.00 | $6,500,000.00 |
Month 36 | 70 | $12,600,000.00 | $6,500,000.00 |
Total Year 3 | 708 | $127,440,000.00 | $75,800,000.00 |
Summary of Totals
Year | Total Clients/Sales | Total Income (COP) | Total Expenses (COP) | Net Result |
---|---|---|---|---|
Year 1 | 113 | $20,340,000.00 | $59,520,000.00 | - $39,180,000.00 |
Year 2 | 399 | $71,820,000.00 | $68,100,000.00 | + $3,720,000.00 |
Year 3 | 708 | $127,440,000.00 | $75,800,000.00 | + $51,640,000.00 |
Balance and Break-even Point
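The break-even point can be derived directly from the monthly projections above. A minimal Python sketch (monthly figures transcribed from the Year 1–3 tables; constant and variable names are ours): on these figures, monthly income first exceeds monthly expenses at Month 18, and the accumulated deficit is recovered by Month 34.

```python
# Monthly projections transcribed from the Year 1-3 tables above (COP).
SUBSCRIPTION_PRICE = 180_000  # Premium Dashboard Subscription, per client/month

clients = (
    [2, 3, 4, 5, 6, 7, 9, 11, 13, 15, 18, 20]            # Year 1
    + [22, 24, 26, 28, 30, 32, 34, 36, 38, 40, 43, 46]   # Year 2
    + [48, 50, 52, 54, 56, 58, 60, 62, 64, 66, 68, 70]   # Year 3
)
expenses = (
    [4_920_000] * 6 + [5_000_000] * 6                                    # Year 1
    + [5_500_000] * 4 + [5_600_000] * 2 + [5_700_000] * 2
    + [5_800_000] * 2 + [5_900_000, 6_000_000]                           # Year 2
    + [6_200_000] * 5 + [6_300_000] * 2 + [6_400_000] * 3
    + [6_500_000] * 2                                                    # Year 3
)

# First month in which monthly income covers monthly expenses.
monthly_break_even = next(
    m for m, (c, e) in enumerate(zip(clients, expenses), start=1)
    if c * SUBSCRIPTION_PRICE >= e
)

# First month in which the cumulative balance turns non-negative.
balance, cumulative_break_even = 0, None
for m, (c, e) in enumerate(zip(clients, expenses), start=1):
    balance += c * SUBSCRIPTION_PRICE - e
    if cumulative_break_even is None and balance >= 0:
        cumulative_break_even = m

print(monthly_break_even)     # 18: first month income exceeds expenses
print(cumulative_break_even)  # 34: accumulated losses fully recovered
```

The final cumulative balance after Month 36 is $16,180,000.00 COP, consistent with the three-year net results above.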
Usability Tests Protocol
The usability tests aim to evaluate the Rizu web console and ensure an effective, efficient, and satisfactory experience for end users managing OpenStack resources. The tests will validate key user flows in the MVP v2 interface and identify friction points affecting comprehension, navigation, and task completion.
Test 1: Account Creation and Login
Task: Create a new account and log into the platform.
Hypothesis: Users will be able to sign up and log in without requiring external guidance. Field labeling and validation feedback will be sufficient to guide completion.
Metrics:
- Time to complete (minutes)
- Success rate (%)
- Number of input errors
- Post-task satisfaction (1–5 scale)
Research Question: Can new users easily understand the registration and authentication process?
Test 2: Project Selection and Navigation
Task: Access the dashboard after login and select an existing project to manage.
Hypothesis: The project list and selection mechanism are intuitive, with no ambiguity regarding where to start managing cloud resources.
Metrics:
- Time to locate and select project
- Success rate (%)
- Navigation errors (wrong clicks, confusion events)
- User-reported clarity of layout (1–5)
Research Question: Do users understand how to locate and access a specific project from the dashboard?
Test 3: Network and Router Creation
Task: Create a network and attach a router using the provided interface.
Hypothesis: The configuration flow for network and router creation follows a logical sequence and requires minimal technical prior knowledge.
Metrics:
- Task completion time
- Number of retries/errors
- Success rate (%)
- User confidence rating (1–5)
Research Question: Can users successfully configure networking components without prior OpenStack experience?
Test 4: Virtual Machine Deployment
Task: Launch a new virtual machine (instance) within a selected project and network.
Hypothesis: The VM creation wizard provides clear guidance and prevents misconfiguration through validation and defaults.
Metrics:
- Time to launch instance
- Error count (e.g., missing fields, invalid selections)
- Completion rate (%)
- Satisfaction level (1–5)
Research Question: Do users understand how to deploy a VM and verify its status after creation?
Test 5: Overall Workflow Comprehension
Task: Perform a complete end-to-end flow: login → select project → create network → launch VM → verify deployment.
Hypothesis: Users can complete the full workflow with limited confusion and minimal external reference.
Metrics:
- Total time for full flow
- Task success rate (%)
- Total number of errors
- Overall satisfaction (System Usability Scale or 1–5 average)
Research Question: Does the integrated workflow provide a coherent and seamless experience across tasks?
Measurement and Reporting
All tests will be conducted using screen recording and direct observation. Quantitative metrics will be complemented with short qualitative interviews post-session to identify usability pain points and improvement priorities.
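If the System Usability Scale option mentioned in Test 5 is adopted for the overall satisfaction metric, scoring follows the standard SUS formula. A minimal sketch:

```python
def sus_score(responses: list[int]) -> float:
    """Compute a System Usability Scale score from ten 1-5 Likert responses.

    Odd-numbered items are positively worded (contribution = response - 1);
    even-numbered items are negatively worded (contribution = 5 - response).
    The summed contributions are scaled by 2.5 onto a 0-100 range.
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS requires ten responses on a 1-5 scale")
    total = sum(
        (r - 1) if i % 2 == 0 else (5 - r)  # i is 0-based: even index = odd item
        for i, r in enumerate(responses)
    )
    return total * 2.5

# Example session: moderately positive responses throughout.
print(sus_score([4, 2, 4, 2, 4, 2, 4, 2, 4, 2]))  # 75.0
```

Averaging SUS scores across participants gives a single comparable number per test round, which makes it easy to track usability improvements between MVP iterations.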
Automated Testing
The automated testing strategy ensures that all delivered functionalities in the Rizu web console are validated for correctness, internal consistency, and user flow reliability. The current scope of automated testing focuses exclusively on Rizu’s own application layer (views, forms, routing, and permission logic) without interacting directly with live OpenStack APIs. The goal is to maintain code stability and predictable behavior across releases by verifying that all implemented views and components respond correctly to expected and edge-case inputs.
Automated tests are executed continuously as part of the development workflow using pytest and pytest-django. Each push or pull request triggers the test suite, ensuring regressions are caught before merging into main. Coverage centers on the application’s authentication, project management, and dashboard modules, which together represent the primary user interaction paths.
Test Coverage Matrix
Functionality | Type of Test | Rationale |
---|---|---|
User Authentication (Login, Registration) | Unit and Integration Tests | Authentication is a critical entry point. Unit tests validate form handling, field validation, and credential verification, while integration tests confirm end-to-end request/response behavior and redirection logic. |
Project Creation and Access Control | Integration and Scenario Tests | Project-level permissions and roles (manager vs. user) require validation across multiple components (views, permissions, templates). Scenario tests confirm correct restriction and authorization behavior. |
Implementation Notes
- The current automated suite consists mainly of unit and integration tests built with pytest-django.
- Unit tests focus on individual views, forms, and permission decorators, validating template usage, form fields, and error handling.
- Integration tests verify that views and templates work together, checking for correct redirections, access control, and context rendering.
- Existing unit tests cover all authentication, registration, and dashboard-related views.
- The test suite runs automatically in CI for every commit, maintaining baseline assurance that recent changes have not broken authentication, routing, or permission logic.
- Scenario and end-to-end tests will be added, using pytest-django's test client to simulate real user workflows (login → project → network → VM).
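As an illustration of the unit-test style described above, here is a minimal pytest-compatible sketch. The `validate_username` helper and its rules are hypothetical, standing in for Rizu's actual form-validation code:

```python
import re

def validate_username(name: str) -> bool:
    """Hypothetical registration helper: 3-30 chars, alphanumerics and underscores."""
    return re.fullmatch(r"[A-Za-z0-9_]{3,30}", name) is not None

# pytest collects these functions automatically; in CI the suite runs on every push.
def test_accepts_valid_username():
    assert validate_username("rizu_admin")

def test_rejects_too_short_username():
    assert not validate_username("ab")

def test_rejects_forbidden_characters():
    assert not validate_username("rizu admin!")
```

Real view and integration tests follow the same pattern but exercise Django views through the test client, asserting on status codes, redirects, and rendered context rather than on a plain helper function.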