Project Outcomes
List any important outcomes or findings not previously reported:
To measure the impact of this project, we included
evaluation surveys each time a Mix Box was delivered. At
first, the evaluation forms were on a flash drive included in
the delivery. When we received feedback that the flash drive
system wasn't as helpful as we intended, we switched to a
shared Dropbox link sent via e-mail. Of the 23 libraries that held at least one Mix Box program, 16 returned evaluation surveys, a response rate of roughly 70% (a visual representation is included in the media items). We asked for feedback on our web site, our
scheduling and delivery, the contents of the Mix Boxes, and
the programs held. Of these respondents, 94% rated the
MKE Mixers web site highly (4 or 5 on a 5-point scale) for logic and ease of navigation, professional design,
and usability. For packaging and delivery, 93.8% reported
their Mix Box arrived on time for their program, and all
respondents indicated that repacking the Mix Box was at least somewhat easy. Library staff used the Mix Boxes for a
variety of ages, with the majority being for children and
teenagers, and 88% reported that their program attendees
were very engaged (5 out of 5 on a 5-point scale).
We also tracked usage data via our Google request e-mail
and calendar to see how often each Mix Box was being
requested and how many libraries were making the
requests. Between August 2016 and August 2017, the Mix
Boxes were circulated 70 times (each check-out lasted 2-4
weeks), with 1-5 Mix Boxes being sent out weekly. Our
busiest months thus far were March-April 2017 and June-
August 2017, with 7-10 individual requests. The Mix Boxes
were used in libraries across Milwaukee County, with 23 out of 27 Milwaukee County libraries making use of the service.
The most used Mix Boxes were the Play Mix, the Craft Mix,
and the Circuits Mix.
With this data, we can see how the initial project was
received and glean insight for improvement and growth in
the future. We believe the reception of our project has been
strong enough to justify requesting further support from Milwaukee
County library directors in order to expand our most popular
Mixes, create a fund for replacement items, and purchase
additional technology to enhance the usability of the AV Mix,
Play Mix, and Brick Mix.
Please briefly describe the importance of these outcomes and findings for future program planning:
Explain one or two of the most significant lessons learned for others wanting to adopt any facets of this project:
Have an enthusiastic team with a shared goal and
administrative/director support.
Our team was formed out of the belief that our project would fill a need for greater access to 21st-century learning tools and technology in our library system, and that shared belief helped us keep the project's momentum going despite delays, complications, and our individual work schedules.
We all wanted the project to succeed and were willing to
work hard both collaboratively and independently. Team
members were able to switch out tasks and areas of
responsibility when needed and put in more or less time on
the project each month based on individual work schedules.
Support from our individual library directors (such as
allowing paid work time to be spent on this project in
addition to regular job duties) and from the MCFLS
administrative staff (such as their willingness to physically
move Mix Boxes to the delivery area and back into storage
and then check over the contents) was instrumental in giving us
time to plan and implement this project.
Have a responsive system to disseminate information.
We initially used flash drives included inside each Mix Box to
disseminate program ideas, instruction manuals, diagrams,
and so on. This proved harder to implement than we’d
thought: once Mix Boxes were circulating, flash drives were
sometimes left out of the bins, and it was difficult to recall
flash drives to make changes or add content. Also, feedback
from Milwaukee County librarians indicated that they wanted
the materials sooner, often weeks before they would receive
a Mix Box for a program, to better plan and promote their
programs. We settled on a shared Dropbox folder instead, with an e-mail account and password set up that we administer. We send out the login and password in the e-mails we use to confirm requests. This is much easier to update and
provides immediate access to librarians requesting a Mix
Box, and they can continue to use the shared materials at
any time. Finally, the Dropbox has potential for further
information-sharing improvements to our project that we are
considering, such as allowing librarians to upload
evaluations and share their own program materials, instead
of or in addition to doing so on our web site.
Have input from multiple sources on important procedures.
The details of how best to store, package, reserve, and
circulate large numbers of craft tools, technology and AV
equipment, and other physical items were a major portion of
our project. So was protecting our financial investments. We
used many of the same companies that have current
accounts with our library system, such as Action Logistics
delivery services and Uline products. Having input from the
system director, system administrators, and multiple library
directors during the planning stages, especially in writing
procedures, meant that concerns were addressed up front,
rather than surfacing months into the project and necessitating big changes. System administrators requested inventory tagging
on large and/or expensive items, labeling Mix Boxes with an
alphabetical system, and locking all containers during
delivery, while library directors were concerned with their
individual library’s financial responsibility in the event of loss
or damage.
Evaluations are more effective when targeted directly to
participants in their preferred format.
We included three kinds of evaluation forms with our
circulating Mix Boxes. There was a link to a Google form for
library staff to complete, a printable form for adult program
attendees, and a printable form for child program attendees.
These evaluations were included on the flash drives at first
and, later, in the Dropbox. We intended for library staff to
print evaluation forms to hand out to the people at their
programs, to be collected and sent to us either with the Mix
Box when returned or scanned and e-mailed. We also
intended for library staff to complete their own online
evaluation of the MKE Mixers service for each program they
did. In practice, however, this didn’t happen: library staff
rarely sent us formal evaluations of any kind, preferring to
e-mail suggestions and feedback directly to the MKE Mixers e-mail address. However, when we prompted library staff to complete an
evaluation by sending out an e-mail request with the link to
the Google form included, we had much better results. To
better evaluate our project as it continues, we will need to
send requests for feedback periodically, rather than leave it
to library staff to remember on their own.
Do you anticipate continuing this project after the current reporting period ends:
Yes
Do you anticipate any change in level of effort in managing this project:
Yes
Explain:
The MKE Mixers team feels strongly about advocating for
programs that teach STEM/STEAM skills, technological
competencies, collaboration, resilience, and all kinds of
creation. LSTA grant funding made it possible for us to
create this service and fill a need in the Milwaukee County
library community, and it has always been our intention to
continue circulating and updating the Mix Boxes after the
end of the funding period. With the strong reception we
have received from Milwaukee County library staff and the
additional support from MCFLS staff in administering the
project (storage and delivery), we intend to focus on
ensuring sustainability by further refining our scheduling
procedure, writing new policies, recruiting new staff, adding
to our online resources (program guides, etc.), and securing
a small but steady source of yearly funding.
Do you anticipate changing the types of activities and objectives addressed by the project:
No
Explain:
Was an evaluation conducted for this project:
Yes
Was a final written evaluation report produced:
No
Can the final written evaluation report be shared publicly on the IMLS website:
No
Was the evaluation conducted by project staff (either SLAA or local library) or by a third-party evaluator:
Third-Party
What data collection tools were used for any report outcomes and outputs:
Did you collect any media for the data:
What types of methods were used to analyze collected data:
Other:
How were participants (or items) selected:
What type of research design did you use to compare the value for any reported output or outcome: