News

AEA Conference Wrap-Up

Our evaluators are back from Minneapolis and have much to share about their conference experience!

The AEA conference in Minneapolis was a great experience. What I liked most about AEA was the ability to craft my own schedule, so I tailored my agenda to include both qualitative and quantitative methods. My favorite sessions were those that provided a wealth of information I could apply on the job back at CCNY: “Getting Started with R” with data visualization specialist David Keyes, “Advancing the Understanding of Measurement Issues in Behavioral Health” with a panel of researchers from across the country, and “Data Design Planning” with evaluation consultant Jennifer Lyons. Although these were my favorites, every session I attended offered information I am eager to bring back and apply at CCNY.

—Meghan Santiago Harris, MPH

Attending AEA 2019 gave me a valuable opportunity to connect with CCNY colleagues and internationally recognized evaluation leaders, some of whom had a significant impact on my dissertation research and on my development as a community social worker (in particular, David Chavis, CEO of Community Science). One of the best workshops explored the usefulness of Muppets, Mad Libs, and Bob Ross as creative tools for building consensus (I discovered I am a “Chaos” Muppet type). Another highlight was Evaluation Jeopardy, which actively and energetically tested our knowledge and rewarded Jessica Tufte and me with a $20 Starbucks gift card for coming in third place!

The daily plenaries were a pleasant surprise, and I think they kept conference participants connected and focused. In the future, I’d like to see an additional plenary on the last day of the conference, when connections start to fray and participants begin to disengage. Also, some of the most in-demand workshops (e.g., Excel, R, and Tableau) were scheduled in very small rooms and could not be attended, despite indications that they would draw large crowds. Staying in an Airbnb allowed for some face-to-face contact with Minneapolis (within walking distance of a fabulous co-op) and an architecturally fascinating walk to and from the conference every day.

I’m already looking forward to AEA 2020.  I want a re-match on Evaluation Jeopardy!

—Sandra Sheppard, PhD, LMSW, CASACII

Spending four days among such a variety of evaluators (from over 20 countries and probably even more states) was a humbling and inspiring experience for me. Sessions ranged from quantitative randomized controlled trials to Evaluation Jeopardy and capacity building with youth and children. Unexpectedly, I found myself compelled to attend a few sessions led by Indigenous evaluators who find themselves caught between Western-based evaluation practices and their own native cultures. From this, I gained perspective on how American culture shapes how we think about not only evaluation but the world in general. For example, many of us don’t consider the literal environmental and ecological impact of our jobs; often unintentionally, we isolate our human-ness from everything else. Imagine if every proposal, report, and meeting in every field had a section on how the work at hand affects the environment and the well-being of other people and living things. On a more personal note, I’m so glad Sandy was there. She fielded a number of questions at the end of our session on implementation strategies for fidelity-measuring tools, adding perspective from her extensive professional experience in human services. I’m thankful she was able to jump in so quickly with relevant knowledge and enthusiasm.

—Jessica Tufte, MPH

As a first-time presenter and attendee at AEA, I wasn’t quite sure what to expect, but I’m happy to report that the conference exceeded my expectations! The hardest part of the 4-day conference was choosing which sessions to attend, as there were so many interesting topics being presented concurrently. I personally found tremendous value in the sessions on qualitative methods and came away with some helpful ways to both collect and analyze data.

It’s impossible to choose just one, but my favorite sessions were “Empathy as a Tool for an Inclusive Future of Evaluation”; “‘Hold on, that’s it?’ A Six-Step Approach for Transforming Qualitative Analysis into a Community-Engaged, Transparent Process”; and “Empowering People with Intellectual Disabilities (ID) to Have Equitable Health Outcomes.” It was also encouraging to meet others in the evaluation field embracing design thinking and human-centered design, evidenced by the ‘pop-up’ Design Loft that AEA ran throughout the day on Friday and the numerous sessions focused on these concepts.

Once my Ignite presentation was out of the way on Friday, I was ready to do some exploring in the Twin Cities! To my delight, I was able to catch Grammy Award winner PJ Morton perform an amazing set at the Varsity Theatre and then grab a late-night bite at Spoon and Stable. Despite the chilly temperatures, I found Minneapolis to be a very warm and welcoming city. I returned to Buffalo feeling inspired, energized, and ready to try some new approaches in my evaluation work.

—Jennifer McQuilkin, MS Arch.

Here’s to AEA 2020! 🥂

Join Us At The American Evaluation Association 2019 Conference

The time has finally come: we’re headed to the Hotdish Capital of the World tomorrow to present at the American Evaluation Association (AEA) Conference!

This year’s event will include more than 40 professional development workshops, feature 700 evaluation presenters, and connect us with a community of more than 3,000 evaluators.  We can’t wait to connect with our peers and learn from other evaluation professionals.  If you’re attending, please come say hi.  Don’t forget to check out our sessions and register if they interest you!

Below are the details for our three sessions:

Getting the Message Across: Effective Visual Reporting Summarizing the Evaluation of a Large K-12 NYS District’s New Sexual Education Program


Session Number: 1175
Track: PreK-12 Educational Evaluation
Session Type: Ignite
Tags: Healthy Relationship Skills, Human Centered Design, School District, Sex Education
Session Facilitator: Jennifer McQuilkin [Evaluation Associate – CCNY, Inc.]
First Author or Discussion Group Leader: Jennifer McQuilkin [Evaluation Associate – CCNY, Inc.]
Second Author or Discussion Group Leader: Jessica Tufte [Senior Evaluation Associate – CCNY, Inc.]
Third Author or Discussion Group Leader: Molly Ranahan [Evaluator]
Time: Nov 15, 2019 (11:30 AM – 12:15 PM)
Room: Hilton Minneapolis Ballroom Salon E – Ignite 30 (Friday, 11:40-11:45 AM)


Audience Level: All Audiences


Session Abstract (150 words):

At a large NYS city school district, one-third of high school students reported having sexual intercourse within the last three months. This previously abstinence-only district is implementing comprehensive healthy decision-making curricula and services to improve student outcomes. The district has now completed year two of implementation.

A nonprofit/consulting evaluation organization is in the midst of a four-year evaluation assessing the effectiveness of the new curricula and all four program components: (1) revision of the district’s current health education curriculum, (2) hiring a professional health educator for students in grades 5 and 6, (3) creating a mobile health and portable dental clinic, and (4) engaging with community partners.

Session attendees will experience examples of highly visual reporting strategies used in this evaluation that get to the point so that all stakeholders quickly understand evaluation processes, the effectiveness of the curricula, and the importance of the program components being implemented.

Evaluators are Impacted: Reflecting on Racial Equity Work


Session Number: 2742
Track: Use and Influence of Evaluation
Session Type: Panel
Session Chair: Elisa Gallegos [Project Coordinator – SMU – CORE]
Discussant: Jennifer Grace LoPiccolo [Evaluation Specialist – Become: Center for Community Engagement and Social Change]
Presenter 1: Elisa Gallegos [Project Coordinator – SMU – CORE]
Presentation 1 Additional Author: Jessica Tufte [Senior Evaluation Associate – CCNY, Inc.]
Presentation 1 Additional Author: Miles McNall [Director – Michigan State University]
Presentation 1 Additional Author: Dodie L. Arnold, PhD, MSPH [Senior Evaluation Manager – Louisiana Public Health Institute]
Time: Nov 13, 2019 (04:30 PM – 05:30 PM)
Room: CC M100 F


Abstract 1 Title: Evaluators are Impacted: Reflecting on Racial Equity Work
Presentation Abstract 1:

Evaluators are called to guide initiatives through a process toward outcomes in an operational and measurable way. What is not always talked about is the effect of this process on evaluators. A group of evaluators partnering in racial equity initiatives throughout the country, through Truth, Racial Healing, and Transformation, explores the ways in which being an evaluation partner for initiatives in racial equity has influenced and changed them as evaluation professionals and as people. In discussing their experiences in racial equity work, the group seeks to bring about dialogue within and outside of the session (a) to inspire evaluators toward a space of contemplation about race that is embedded throughout their work, and (b) to encourage evaluators to maintain a dual focus on the evaluation work and their own internal assumptions and biases about race that affect the work.

Audience Level: All Audiences

Implementation Strategies for Fidelity-Measuring Tools in Preventative Family and Child Welfare Programs


Session Number: 1173
Track: Cluster, Multi-site and Multi-level Evaluation
Session Type: Roundtable
Tags: accelerating evaluation theory, practice, adaptive evaluation, child abuse and neglect prevention, Child Care Systems, child welfare, cluster, multi-site and multi-level eval, Fidelity Assessments, High Fidelity Wraparound, implementation, Survey Methodology, WFI-EZ
First Author or Discussion Group Leader: Jessica Tufte [Senior Evaluation Associate – CCNY, Inc.]
Second Author or Discussion Group Leader: Meghan Santiago Harris [Evaluation Associate – Community Connections of NY]
Third Author or Discussion Group Leader: Sandra Sheppard [Evaluation Associate]
Time: Nov 16, 2019 (11:15 AM – 12:00 PM)
Room: Hilton Symphony I


Audience Level: All Audiences


Session Abstract (150 words):

A nonprofit/consulting evaluation organization in Erie County, NY implemented and collected WFI-EZ (Wraparound Fidelity Index-EZ) surveys through three High Fidelity Wraparound (HFW) Care Coordination Agencies (CCAs) using a stratified random sample. After a literature review, evaluators engaged with the CCAs to build awareness and positive message framing around the surveying. The evaluation organization then sent paper surveys to clients and team members, while vendors and facilitators received online survey links by email. Response rates will be finalized before fall 2019.

Discussion questions:

  • What other evaluation methods/tools have attendees used to measure the fidelity of Evidence-based Interventions (EBIs)?
  • What challenges or successes occurred during these evaluations? What strategies helped overcome challenges? (Examples of challenges include follow-up, response rates, stakeholder buy-in, client engagement, etc.)
  • What are some strategies to engage stakeholders from all levels of an evaluation? Stakeholders may include the survey participants/community members, CCAs, other evaluators, or government funders.
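
For readers curious what the stratified random sampling described in this session abstract can look like in practice, here is a minimal sketch in Python. It samples the same fraction of clients within each Care Coordination Agency so every agency is proportionally represented; the agency names, enrollment counts, and sampling fraction are illustrative placeholders, not CCNY’s actual data or procedure.

```python
import random

# Illustrative enrollment rosters, stratified by agency (hypothetical data).
random.seed(42)  # fixed seed so the draw is reproducible
enrollment = {
    "Agency A": [f"A-{i}" for i in range(120)],
    "Agency B": [f"B-{i}" for i in range(80)],
    "Agency C": [f"C-{i}" for i in range(50)],
}

def stratified_sample(strata, fraction):
    """Draw the same fraction of clients from each stratum (agency)."""
    sample = {}
    for agency, clients in strata.items():
        k = max(1, round(len(clients) * fraction))  # at least one per agency
        sample[agency] = random.sample(clients, k)  # simple random draw within stratum
    return sample

# Survey 25% of clients at each agency.
surveyed = stratified_sample(enrollment, fraction=0.25)
for agency, clients in surveyed.items():
    print(agency, len(clients))
```

Because the draw happens within each stratum, a small agency cannot be crowded out of the sample by a larger one, which is the usual motivation for stratifying.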

Vikram’s PPV

In 2017, CCNY launched an employee wellness initiative called Paid Paid Vacation (PPV). Every quarter, full-time staff are entered into a drawing for a chance to win $300 to put toward something fun (as in, not bills) during a two-day vacation. Those days are in addition to their usual paid time off. This quarter, our Senior Data Analyst and resident Excel wizard, Vikram, won.

Here’s what he had to say about his recent timezone jump:

“I used my Paid Paid Vacation towards a trip to Denver, Colorado with my friends. The vacation started with a snowstorm, giving me a taste of the impending Buffalo winter.

We spent most of our time in the Rocky Mountains, enjoying hikes and wildlife-watching. On our last day, we took a detour to Garden of the Gods, a large sandstone formation that is a part of the excellent natural beauty and unique landscapes of Colorado.

It was my first trip to Denver and I thoroughly enjoyed it. Thank you so much CCNY for providing me this opportunity. I am so grateful.”

Click to read more about CCNY’s unlimited vacation policy and last quarter’s PPV winner.