Udruga "Ecco Ars" Mostar

Community Data, Culture, Accountability

Impact & Reports

Measured public value, reported with clarity.

This page brings together outcome metrics, reporting priorities, annual progress snapshots, and field evidence from the programs led by Udruga "Ecco Ars" Mostar across culture, civic participation, and community accountability.

4,260 residents, students, and volunteers engaged in 2025
63 community recommendations submitted to institutions
78% of program milestones delivered on approved schedule
86% of active projects with documented outcome tracking

Reporting focus

How impact is translated into accountability

Reports combine attendance data, budget allocation, partner verification, and participant testimony so funders, residents, and public institutions can assess what changed, how resources were used, and where delivery needs correction.

The reporting cycle is structured around quarterly reviews, annual public summaries, and project-close evidence files that document outputs, outcomes, risks, and follow-up actions.

1. Report Library

Three reporting streams that shape decisions

Each report type serves a different audience, but all of them use the same evidence logic: verified activity data, partner review, and a plain-language summary of what the numbers mean in practice.

Annual Impact Report

A board-level and public-facing summary covering reach, budget use, partner contributions, milestone delivery, and major lessons from the year’s portfolio.

Audience: Public and funders · Cadence: Annual · Format: Narrative + dashboard

Quarterly Delivery Review

An operational report focused on schedule variance, recruitment, participation quality, implementation risks, and the corrective actions agreed with project leads.

Audience: Staff and board · Cadence: Quarterly · Format: KPI review

Project Evidence File

A close-out package documenting participant data, procurement notes, communications outputs, and partner signoff for a specific intervention or grant cycle.

Audience: Due diligence teams · Cadence: End of project · Format: Annex set

2. Outcome Stories

Field evidence paired with measurable outcomes

The strongest reporting is not only numerical. It connects the figures to program decisions, participant pathways, and the visible changes they support in schools, neighborhoods, and community institutions.

Youth participation shifted from attendance to retention

Tracking moved beyond sign-in sheets to follow whether young participants stayed through a full cycle, returned for advanced sessions, and referred peers into the next cohort.

Completion improved from 49% to 83%
Peer referrals increased across 3 schools
Mentor logs validated retention gains

“The reporting now shows not only who joined, but who stayed, who progressed, and what support actually made the difference.”

Program coordination note
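
To make the shift concrete, here is a minimal sketch of how completion and return rates could be computed from cycle records. The record fields (`participant_id`, `sessions_attended`, and so on) and the 80% completion threshold are illustrative assumptions, not the organization's actual schema or policy.

```python
from dataclasses import dataclass

# Hypothetical record shape: the organization's real sign-in schema
# is not published, so these field names are illustrative only.
@dataclass
class ParticipantRecord:
    participant_id: str
    sessions_attended: int
    sessions_in_cycle: int
    returned_for_advanced: bool
    referred_peers: int

def retention_summary(records: list[ParticipantRecord],
                      completion_threshold: float = 0.8) -> dict:
    """Summarize retention rather than raw attendance.

    A participant counts as completed when they attended at least
    `completion_threshold` of the cycle's sessions; the threshold
    is an assumed policy choice, not a documented rule.
    """
    total = len(records)
    if total == 0:
        return {"completion_rate": 0.0, "return_rate": 0.0, "peer_referrals": 0}
    completed = sum(
        1 for r in records
        if r.sessions_attended / r.sessions_in_cycle >= completion_threshold
    )
    returned = sum(1 for r in records if r.returned_for_advanced)
    return {
        "completion_rate": completed / total,
        "return_rate": returned / total,
        "peer_referrals": sum(r.referred_peers for r in records),
    }
```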

Resident proposals became trackable civic outputs

Forum documentation was redesigned so each recommendation could be logged, grouped by issue, and followed through institutional response points rather than disappearing into meeting minutes.

126 resident proposals structured for review
Mobility and youth-space issues dominated
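
As a sketch of what such a trackable log implies, the structure below gives each recommendation an issue group and a response status so it can be followed past the meeting minutes. The status values and field names are assumptions for illustration, not the forum's actual documentation format.

```python
from dataclasses import dataclass
from collections import Counter

# Illustrative response stages; the forum's actual institutional
# response points are internal and may differ.
STATUSES = ("logged", "submitted", "acknowledged", "acted_on", "declined")

@dataclass
class Proposal:
    proposal_id: int
    issue_group: str   # e.g. "mobility", "youth spaces"
    summary: str
    status: str = "logged"

def issue_breakdown(proposals: list[Proposal]) -> Counter:
    """Group proposals by issue so dominant themes surface in review."""
    return Counter(p.issue_group for p in proposals)

def awaiting_response(proposals: list[Proposal]) -> list[Proposal]:
    """Proposals submitted to institutions but not yet answered."""
    return [p for p in proposals if p.status == "submitted"]
```
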
3. Reporting Timeline

How reporting practice matured over time

What began as project documentation for cultural activities has developed into a broader reporting system that links community programs to governance, financial transparency, and public-interest communication.

2015

Program attendance became a formal baseline tool

Participant sign-ins and completion records were standardized for recurring youth and arts activities, creating the first usable comparative dataset across program cycles.

2019

Budget and procurement notes entered the reporting frame

Project reports were expanded to include spending categories, supplier rationale, and implementation notes, improving donor confidence and internal review discipline.

2023

Partner verification and field interviews were added

Quantitative outputs were paired with staff interviews, partner confirmations, and participant testimony to reduce over-reporting and sharpen lesson capture.

2025

Quarterly dashboards began guiding management decisions

Delivery variance, risk flags, and outcome movement are now reviewed in a recurring format, allowing earlier intervention when a program falls behind or evidence quality drops.
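
A minimal sketch of the kind of rule such a dashboard review might apply, flagging a program when schedule slippage or evidence coverage crosses a threshold; the field names and tolerance values are hypothetical, not the organization's internal criteria.

```python
def review_flags(program: dict,
                 max_slippage: float = 0.15,
                 min_evidence_coverage: float = 0.8) -> list[str]:
    """Return risk flags for one program in a quarterly review.

    `program` is assumed to carry quarterly counts:
      milestones_planned, milestones_delivered, evidence_coverage.
    Both thresholds are illustrative, not documented policy.
    """
    flags = []
    planned = program["milestones_planned"]
    delivered = program["milestones_delivered"]
    if planned and (planned - delivered) / planned > max_slippage:
        flags.append("schedule variance above tolerance")
    if program["evidence_coverage"] < min_evidence_coverage:
        flags.append("evidence quality below standard")
    return flags

# Example: 7 of 10 milestones delivered, 90% evidence coverage
# -> flags the 30% slippage for earlier intervention.
print(review_flags({"milestones_planned": 10,
                    "milestones_delivered": 7,
                    "evidence_coverage": 0.9}))
```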

4. Indicator Table

Core indicators tracked across impact reporting

The organization’s indicator set mixes service-delivery metrics with quality and governance signals, allowing reports to show both scale and accountability.

Indicator | Why it matters | 2025 snapshot | Evidence source
Participant completion rate | Shows whether people stay long enough to receive the intended value of a program. | 83% in cohort-based youth programs | Attendance logs and endline registers
Resident recommendation count | Tracks whether dialogue work results in concrete civic outputs. | 63 recommendations advanced through formal channels | Proposal registry and meeting minutes
Schedule adherence | Tests operational reliability and flags implementation bottlenecks early. | 78% of milestones delivered on time | Quarterly workplans and variance review
Outcome-tracked portfolio share | Measures how much of the organization's work is backed by explicit monitoring. | 86% of active projects | Project dashboard and board review pack
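
The two rate indicators above reduce to simple ratios once counts are extracted from the underlying logs; a hedged sketch with illustrative figures (the enrollment counts are invented to match the published 83% snapshot, not taken from the reports):

```python
def completion_rate(completed: int, enrolled: int) -> float:
    """Share of enrolled participants who finished a full cycle."""
    return completed / enrolled if enrolled else 0.0

def schedule_adherence(on_time: int, total_milestones: int) -> float:
    """Share of milestones delivered on the approved schedule."""
    return on_time / total_milestones if total_milestones else 0.0

# Invented counts consistent with the 2025 snapshot: 83% completion
# could arise from, for example, 249 completers out of 300 enrolled.
assert round(completion_rate(249, 300), 2) == 0.83
```
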
5. Resource Use

How financial reporting connects spending to public benefit

Impact reports remain credible only when program results are presented alongside transparent information on delivery costs, support functions, and the choices behind resource allocation.

Annual resource distribution is reviewed against outcomes, not only against budget lines.

Program spending is analyzed in relation to engagement, completion, partnership activity, and follow-up outcomes. This prevents financial summaries from becoming detached from the actual value created in communities.
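
One way to operationalize that link is a unit-cost view that divides direct delivery spend by completed participation per activity; a minimal sketch with hypothetical activity lines and figures:

```python
def cost_per_completion(spend: float, completions: int) -> float:
    """Direct delivery spend divided by participants who completed."""
    return spend / completions if completions else float("inf")

# Hypothetical activity lines; names and figures are illustrative
# only and do not come from the organization's accounts.
activities = [
    {"name": "youth arts cycle", "spend": 12000.0, "completions": 80},
    {"name": "civic forums", "spend": 9000.0, "completions": 45},
]

for a in activities:
    unit = cost_per_completion(a["spend"], a["completions"])
    print(f'{a["name"]}: {unit:.2f} per completion')
```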

74% of 2025 income directed to program delivery
18 active municipal and institutional partnerships sustained
11 schools and youth hubs served through recurring activity
92% participant satisfaction across tracked cohort programs

Financial transparency supports delivery decisions

When cost categories, procurement notes, and implementation results are reviewed together, the organization can shift resources toward activities that show stronger participation, clearer outcomes, and better delivery reliability.

6. Visual Field Notes

Reporting is strengthened by field context

Images are used to document program environments, participant interaction, and the practical settings in which outcomes are produced. They do not replace evidence, but they help situate it.

Planning and review sessions

Internal and partner workshops are used to align indicators, clarify evidence standards, and capture corrective actions before the next reporting cycle begins.

Community-facing delivery moments

Documentation reflects where engagement happens: neighborhood forums, local events, youth hubs, and the informal spaces where participation becomes visible.

Reflection and evidence capture

After-action reflection is part of the reporting discipline, especially when results vary across cohorts or when implementation barriers need to be documented clearly.