• 08/17/2018 9:23 PM | Cara Karter (Administrator)

    Dear CEA Members,

    Welcome to our inaugural CEA Newsletter! We are excited to launch this quarterly newsletter to keep you informed about CEA events and interests. The CEA Executive Committee and Board have been busy! Our focus has been to rebuild CEA infrastructure and leadership by:  

    • Expanding our membership: Our CEA membership is the largest it has ever been and we welcomed our first ever institutional members, the AIDS Foundation of Chicago and the Education Development Center. In the coming year, we will be polling members to learn about your needs and vision for CEA programming and events. Members looking to get more involved can volunteer for our committees! Send an email to connect@evalchicago.org if you are interested in learning more.

    • Creating a fresh look: The Communications Committee has been hard at work! CEA has a brand new logo and branding. Check it out on our new website www.evalchicago.org.

• Increasing programming: Led by our Professional Development Committee, CEA has hosted six events so far in 2018. These included presentations and workshops from evaluation leaders like David Fetterman and Tom Lipsey, as well as local evaluators like the St. Louis AEA affiliate’s Matt Feldmann and UIC’s Terry Ann Solomon. Our year continues with even more opportunities for professional development this month and next.

    • Utilizing Partnerships: CEA is leveraging our partnerships with The Evaluation Institute, Lurie Children’s Hospital, and Northwestern University to bring together our membership with a broader group of organizations and evaluators. We are also active members of AEA’s Local Affiliate Council which offers opportunities for us to network with other AEA affiliates.

    Please feel free to contact us at connect@evalchicago.org if you have any comments for CEA Leadership! I hope to see you soon at a CEA event!

    Best,
    Asma Ali

    Upcoming Events:

    Jazzin' at the Shedd with AEA President Leslie Goodyear

    Wednesday, 08/22/2018
    6:00 PM - 8:30 PM
    Shedd Aquarium, 1200 S Lake Shore Drive, Chicago, IL 60605

    The Chicagoland Evaluation Association is delighted to welcome AEA President Dr. Leslie Goodyear to our Annual Event at the Shedd Aquarium. Leslie will be speaking on the AEA 2018 Conference Theme, Speaking Truth to Power, the new Guiding Principles for Evaluators, and other AEA happenings. 

    Event Page: https://www.evalchicago.org/event-2816271

    Evaluation & Community Collaboration Conference

    Thursday, 08/30/2018
    9:00 AM - 3:30 PM
    Lurie Children's, 11th Floor Conference Center, 225 E Chicago Ave., Chicago, IL 60611

    Representatives from Northwestern, community agencies, and the CDPH will talk about their experiences working together on comprehensive HIV prevention demonstration projects, and the session will include discussion about working together to improve our response to HIV.

    Event Page: https://www.evalchicago.org/event-3007373

    Becoming Evaluation Ready: Evaluation Training Session for Community Organizations

    Wednesday, 09/12/2018
    2:00 PM - 4:00 PM
    Lurie Children's, 11th Floor Conference Center, 225 E Chicago Ave., Chicago, IL 60611

    This session is designed for community organizations and will provide information on how to prepare for partnering with an evaluator and/or evaluating their programs.

    Event Page: https://www.evalchicago.org/event-3007375

    The Chicago Evaluator: Member feature

    Because CEA members have such tremendous talent and experience to share with other members, we are featuring one member post in our quarterly newsletters. If you are interested in sharing your work with other members, email us at connect@evalchicago.org.

    How To Bring A Logic Model To Life

    and wield a MUCH more powerful evaluation tool

    Hi, Amelia Kohm of Data Viz for Nonprofits here to talk about logic models. The evaluation world is lousy with logic models. You may know the logic model by one of its other names, such as causal chain, model of change, roadmap, or theory of change. A logic model is really just a humble flow chart with an erudite name. It’s a visual representation of how an intervention or program is supposed to work. And it should help evaluators articulate evaluation questions and select appropriate methods and measures to answer them.

    Lessons Learned

    • Logic models are hypothetical, best case scenarios. And, well, reality can bite.
    • Logic models often get more play during the planning and proposal-writing phase of a project than during implementation. During the daily work of a project, logic models are taking it easy, gathering dust in files and on servers.

    But what if we could plug a logic model into the real world? What if we could see how the plan is playing out in reality and make adjustments along the way?

    Cool Trick

    With data visualization software like Tableau, you can create a “living logic model.” The current that animates it is real-time data. A living logic model compares theory to reality by showing progress to date. It also allows you to track the progress of subgroups and individuals. So it helps you to plan, to ask the right questions, and to make mid-course corrections.

    A living logic model is more understandable and tangible than a traditional one. The user can scroll over any component in the model to learn more about it. Such descriptions can include photos and web links for interested users. A living logic model shows progress to date. Color saturation indicates the status of each component. And the user can click on any component to see what subgroups might be driving progress, stagnation, or regression.
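    For readers curious how the mechanics might work outside a tool like Tableau, here is a minimal, hypothetical Python sketch of the core idea: pair each logic-model component with a live metric, and convert progress toward a target into the saturation value that drives the component's color. All component names and numbers below are invented for illustration.

    ```python
    # Hypothetical "living logic model" sketch: each component of the model
    # carries a target and a live "actual" value; color saturation is the
    # fraction of the target reached, capped at full saturation.

    components = {
        "Recruit tutors":         {"target": 40,   "actual": 31},
        "Deliver sessions":       {"target": 500,  "actual": 210},
        "Improve reading scores": {"target": 0.30, "actual": 0.12},
    }

    def saturation(actual, target):
        """Fraction of the target reached, capped at 1.0 (full saturation)."""
        return min(actual / target, 1.0)

    for name, m in components.items():
        s = saturation(m["actual"], m["target"])
        print(f"{name}: {s:.0%} of target")
    ```

    In a real dashboard, these saturation values would be refreshed from live program data and mapped onto the flow-chart shapes, which is what turns a static planning diagram into a monitoring tool.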

    Play around with this living logic model that I created for a tutoring program to get an idea of its potential. Scroll over components to get more information. Click on components to get data on individual students. Enjoy and please let me know if you have any questions.

    - Amelia Kohm, Founder and Consultant, Data Viz for Nonprofits


  • 08/15/2018 10:21 AM | Cara Karter (Administrator)
    Bridge Communities is seeking an experienced qualitative program evaluation consultant or organization to conduct oral history interviews with alumni of their transitional housing program. This work builds on a pilot and an ongoing larger survey evaluation of long-term outcomes. Only consultants and firms/organizations with extensive experience conducting qualitative program evaluation studies will be considered.

    Approximate period of service: October 2018 – September 2019
    Apply to: RFP@bridgecommunities.org
    Deadline for Application: 5 PM CST, August 31, 2018

    Project Background:
    Bridge Communities is celebrating 30 years this fall 2018. We are committed to hearing how our programs have impacted the lives of our transitional housing program alumni and to hearing specific recommendations for improving Bridge Communities. Bridge Communities serves families with dependent children who are homeless or at risk of becoming homeless and who live or work in DuPage County, a suburban area of Chicago, Illinois. Over 90% of Bridge’s client families are headed by single mothers, approximately 55% of whom are survivors of domestic violence. The transitional housing program has developed over the last 30 years from a short-term 3-month program to a 24-month program with potential for extensions. Supportive services have also grown from a financial literacy and mentoring model to also encompass employment services, adult education, children’s programming, nutrition, car donations, and mental health services.

    We launched a pilot retrospective study in the Spring of 2018 that includes an oral history project with a purposive sample of alumni and exhaustive efforts to survey as many alumni as possible. An estimated 850 families have participated in our transitional housing program over the last 30 years. As part of the pilot we developed recruitment resources and an interview protocol. Oral history interviews have been taking approximately 60 minutes. Bridge Communities will provide the protocol and guide for the oral history interviews. The ability of the contractor to collaboratively review pilot lessons learned and contribute to process and tools is highly desirable. We are hoping that a consultant group can complete data collection (interviewing approximately 30 additional alumni) and take the lead on analysis and reporting. Our Manager of Data and Evaluation, Susan Ryerson Espino, PhD, will be a collaborative partner to the retained consultant/firm and will oversee all data collection, analysis, and reporting. Bridge Communities will cover the incentive for participants and transcription costs. All recordings, transcripts, and work products associated with data collection, analysis, and dissemination will be owned by Bridge Communities and stored on our secured SharePoint site.

    This is part of an evaluation study and we are interested in incorporating elements of the oral histories into dissemination efforts (staff development efforts, funding reports and proposals, conference presentations, and peer reviewed journal articles). We are open to exploring the need for and the expense of an external ethics (IRB) review.  Should this be recommended, the Contractor will facilitate submission of all relevant materials for local ethical approval.

    Timelines:
    • Work to begin October 1, 2018 and conclude by September 30, 2019
    • Sampling strategy finalized by October 15, 2018
    • Early review of pilot interview and drafting of coding framework by October 15, 2018
    • Initiate new data collection by October 15, 2018 and conclude by March 2019
    • Written summary brief of progress, early lessons learned, and next steps by May 1, 2019
    • Written 20-page final report with an orienting literature review to contextualize process and findings, methods, analytic strategy, results, and discussion by September 30, 2019.
    Interested organizations are asked to submit a proposal that includes:

    • A technical narrative no longer than 5 pages describing how they will accomplish the tasks described above. Please include:
    • An institutional/organizational capacity statement, roles and responsibilities of all staff consultants, and a summary of similar work conducted in the past 2 years
    • Proposed collaboration, data collection methods, data sharing and protection of confidentiality, analysis approach including software solutions, quality control methods, and dissemination strategies
    • Append a detailed budget including justification. Budget should include all costs associated with data collection field work and if needed costs to cover ethics review.
    • Append a project timeline with key activities and deliverables
    • Append CVs of key personnel who will be engaged in the data collection and analysis
    • Append a certificate of insurance showing the limits and deductibles on your policy, who is insured (i.e., you or your business’s name as the named insured), the name of the insurance company issuing the policy, and the effective and expiration dates of your policy.
    • Append 2-3 references that can be contacted regarding the quality of the organization’s work
    • Append a sample work product
    Complete proposals should be submitted to our system by 5pm (CST), August 31, 2018.

    All proposals will be reviewed for completeness, quality, and feasibility.
    The top 3 applicants will be invited by 9/17/18 to present their proposals to a selection committee composed of staff and constituents during the week of September 24, 2018. One applicant will be selected and notified by 9/28/18 and contracting and work will begin immediately.
  • 07/19/2018 11:41 AM | Cara Karter (Administrator)

    The Obama Foundation is hiring an Impact and Evaluation Senior Associate who will support the development of a learning culture within the Foundation and implementation of the Foundation’s Learning and Evaluation strategy. Reporting to the Impact and Evaluation Manager, the Senior Associate will design and implement evaluations of Foundation programs and initiatives, and serve as an evaluation expert and learning partner.

    The ideal candidate has a proven track record designing and conducting evaluations and measurement systems, experience helping organizations use data for reflection and strategic refinement, experience collecting quantitative and qualitative data, and demonstrated interest and experience considering issues of marginalization and equity in evaluation. 

    Four or more years of relevant evaluation experience is required. This position is open to all Foundation offices: Chicago, Washington, D.C., and New York City. To apply, please submit your application, including your resume and cover letter, here. The Foundation is committed to creating a diverse environment and is proud to be an equal opportunity employer. We encourage individuals of all backgrounds to apply.

    About the Obama Foundation
    Founded in January 2014, the Obama Foundation is a living, working center for citizenship in the 21st century aimed at identifying, training, and connecting the next generation of leaders and engaged citizens. The Foundation is developing the Obama Presidential Center on the South Side of Chicago to serve as headquarters for the projects it will undertake across the city, the nation, and around the world. 

    As President Obama said in his farewell address, "I am asking you to believe. Not in my ability to bring about change — but in yours.” That concept is one the Obamas championed from the beginning, and it is now a cornerstone of the Obama Foundation’s efforts to support and develop the next generation of active citizens and emerging young leaders at home and around the world.

    Real change—big change—takes many years, and requires each generation to believe that its participation matters and embrace the obligations and opportunities that come with the most important office in a democracy: that of Citizen. Together, we have made extraordinary progress. Because there is more to do, this work lives on in the Obama Foundation.

    For more information on the Obama Foundation, please visit www.obama.org.
  • 07/16/2018 5:25 PM | Cara Karter (Administrator)

    Members of the Chicagoland Evaluation Association are invited to participate in Claremont Evaluation Center’s annual Professional Development Workshops. The annual workshop series provides working professionals and students with world-class practical and theoretical training in evaluation and applied research. This year’s series is scheduled for August 15-22, and will feature 19 seminars covering various topics in statistical analysis, culturally responsive evaluation, evaluation capacity building, and more.

    This longstanding series, taught by leading academics and seasoned practitioners, can be experienced onsite at Claremont Graduate University, or wherever you are, thanks to highly interactive online webcasts. To see a full list of the workshops and to register, go to the Claremont Evaluation Center website.

    For group rates and student discounts, please contact Omara Turner, omara.turner@cgu.edu, (909) 607-9013.
    Limited scholarships available.



  • 07/10/2018 1:24 PM | Cara Karter (Administrator)

    Event: Building Partnerships for Transformation: Wisconsin and African Evaluators + Leaders Planting Seeds for Change
    Date: July 23rd, 9am-12pm
    Location: Wisconsin Idea Room/159 Education Building 1000 Bascom Mall, Madison, WI 53706
    RSVP: http://evaluation.wildapricot.org/event-2984051

    You are invited to a meeting of ¡Milwaukee Evaluation! members with President Barack Obama's Young African Leaders Initiative (YALI) -- many of whom are engaged in evaluation and visiting Wisconsin for a limited time. This event is designed to exchange ideas and mix professional development with plenty of networking. 

    Professional development exchanges may include the following topics:

    • Wakanda evaluation: How can Africans and African Americans connect on evaluation?
    • Unpacking capitalism: How do we understand capitalism in our evaluation work?
    • Corruption as a barrier to evaluation
    • Feminist evaluation: How do men practice it?

    ¡Milwaukee Evaluation! is charging a small fee to cover refreshments and drinks but wants to be responsive to everyone's dietary needs and restrictions. If cost is an issue, please let them know.

    If you have questions, please contact Elise Ahn at elise.ahn@wisc.edu.



  • 07/09/2018 7:19 PM | Cara Karter (Administrator)

    Casey Family Programs is in search of a Director in their Research Services department. The Director of Research Services has responsibility for leading evaluation studies for the team within Research Services at Casey Family Programs. This individual will work with Casey staff and other agencies to evaluate the effectiveness of child welfare and related prevention, education, employment and mental health programs, products or tools.

    This role will develop and lead a team whose purpose is to plan and coordinate research or evaluation projects, including research design, data collection, data coding, data analyses and report writing. This role requires familiarity with state and county child and family welfare agency research capabilities and resources, as well as the ability to develop and evaluate standardized measures for evaluating the effectiveness of child welfare programs, products and tools.

    It would be desirable if the candidate has experience in one or more of these areas:

    • Economic analyses - with experience in conducting cost savings, benefit-cost, fund mapping, and other forms of financial analyses
    • Policy research methods and statistical analysis
    • Ability to access and analyze, on demand, a variety of cross-sector publicly-available data in a timely and competent fashion

    To review the full job description and apply for consideration, please submit resume and cover letter directly through the Casey Family Programs job application site: https://rew12.ultipro.com/CAS1011/JobBoard/JobDetails.aspx?__ID=*E668FDEA969343BB

    Questions should be directed to Heidi Tobaben at htobaben@casey.org

  • 06/16/2018 7:34 PM | Cara Karter (Administrator)

    CEA will host a featured week in the AEA365 blog in early August. We are looking to feature the exciting and innovative evaluation work of our CEA members. Topics can be related to CEA or your own evaluation work - anything that would be of interest to other evaluators across the nation!

    Posts (500 words or less) will be due by Monday, July 16th. AEA365 outlines guidelines for posts here

    Leah Neubauer and Asma Ali will be leading and co-editing the submission process. If you are interested in writing a posting, please email Leah (leah.neubauer@northwestern.edu) and Asma (asma.ali1@gmail.com) with your idea. 



  • 05/19/2018 5:29 PM | Anthony Heard (Administrator)

    Job Title: Evaluator / Coach

    As an Evaluator/Coach with PIE Org, you will be responsible for helping partner organizations achieve their evaluation goals. The role includes external evaluation work (i.e., process, formative, and summative) and evaluation coaching to help organizations build internal evaluation capacity. Most evaluation work will be conducted in the areas of educational and social service programs. All evaluation work will utilize PIE’s research-based system to help partner organizations complete evaluation plans, data analysis, and evaluation reports, as well as make recommendations to client organizations’ senior management to improve their strategies and protocols. It is essential to develop and maintain good rapport with the grantees and foundations with whom we partner. Strong communication and interpersonal skills are critical for this role. The ideal candidate must have the ability to work independently, multi-task, prioritize, and expedite job responsibilities to complete work in a timely and high-quality manner. Because much of our work is completed on-site in the neighborhoods where partner organizations are located, this position will work remotely a majority of the time. This is a full-time position and includes a substantial compensation and benefits package with health care, retirement, and a generous vacation policy.

    Supervision: The Evaluator / Coach will directly report to the Evaluation Director, who will provide ongoing supervision and support, as needed.

    Principal Duties:
    • Develop and use logic models to describe complex programs and their outcomes.
    • Provide leadership in a team setting, move members forward and build consensus.
    • Work with stakeholders to develop a comprehensive strategic evaluation plan that prioritizes evaluation activities to be completed during regular funding periods.
    • Engage stakeholders in an evaluation process based on shared priorities, including meeting facilitation, presentation, conflict resolution, and negotiation skills.
    • Ensure that evaluation activities are complementary to program(s) strategic plans and reporting requirements.
    • Educate program staff and partners about evaluation concepts and methods.
    • Understand the context of a program and how it affects program planning, implementation, outcomes, and the evaluation.
    • Conduct formative and summative evaluations.
    • Develop evaluation plans and approaches for generating, revising, and prioritizing evaluation questions.
    • Apply various evaluation designs and methods (e.g., quasi-experimental, mixed methods), selecting appropriate quantitative or qualitative methodologies to increase use of findings by primary stakeholders.
    • Lead program’s staff in developing and testing data collection instruments.
    • Identify and assess existing data sources and literature for their potential use in program evaluation.
    • Gather data using qualitative and quantitative approaches such as interviews, group processes, participant observation, surveys, electronic data files, or other methods.
    • Construct databases, conduct and supervise data entry, and perform data cleaning.
    • Apply methods for protecting confidential data.
    • Conduct analyses using appropriate analytic tools for quantitative data (e.g., SAS, SPSS) and/or qualitative data (e.g., Nvivo, Atlas.ti, MaxQDA).
    • Develop criteria and standards reflective of the values held by key evaluation stakeholders.
    • Synthesize information generated through an evaluation to produce findings that are clear and directly aligned to evaluation questions and programmatic outcomes.
    • Work with stakeholders to develop feasible recommendations based on evaluation data.
    • Prepare and present evaluation results in a manner that increases the likelihood that they will be used and accepted by a diverse group of stakeholders.
    • Develop and implement a communications and dissemination plan.

    Qualifications: Candidates who apply should have advanced skills in MS Excel and the Microsoft Office Suite, advanced statistical and analytical skills, and 5 years of experience developing evaluation plans and measurement systems. A Master’s degree in applied statistics, research methods, social work, public health, or education policy is required; a doctoral degree is preferred. Reliable transportation is a must.

    Statement on Inclusivity: PIE is committed to creating a diverse environment and is proud to be an equal opportunity employer.

    Anticipated hire date: Summer 2018

    To Apply: Please e-mail cover letter, resume, and work sample of a data analysis write up. Please send all documents as one pdf file with your last name as the file name, to admin@pieorg.org.

  • 04/10/2018 5:20 PM | Anthony Heard (Administrator)

    The webinar has passed, but only CEA members can access the recording. Email connect@evalchicago.org for the link.

    Speaker: Michael Quinn Patton, Ph.D. 
    Webinar title: Evaluation Science: How Knowledge is Validated, How the World is Changed, and How Nobel Prizes are Awarded.
    Presented by the School of Social Science, Policy and Evaluation at Claremont Graduate University.

    In this “post-truth” era, conceptualizing evaluation as science and practitioners as evaluation scientists serves both science and evaluation. Both are evidence-based processes; when the credibility of scientific evidence is attacked, the credibility of evaluation evidence is threatened as well. Michael Quinn Patton, the founder and director of Utilization-Focused Evaluation, will present the case for treating evaluation as science and how that treatment complements but is different from treating evaluation as a profession/discipline.

    Patton’s Utilization-Focused Evaluation is based in Minnesota but works worldwide. Patton was a founding faculty member of The Evaluators’ Institute and holds the title of Professor of Practice in Claremont Graduate University’s School of Social Science, Policy & Evaluation. He is a former president of the American Evaluation Association and received the American Evaluation Association Award for Research on Evaluation in 2017. He previously received the Alva and Gunnar Myrdal Award for “outstanding contributions to evaluation use and practice” and the Paul F. Lazarsfeld Award for lifetime contributions to evaluation theory, both from the American Evaluation Association. He is the author of eight evaluation books, which have been used in over 500 universities worldwide.

    For more information, contact Linda Pillow, linda.pillow@cgu.edu
    909-607-1410. Since 2002, the John Stauffer Charitable Trust has sponsored a series of informative talks on current research in applied psychology for DBOS students, faculty, and the general community.

  • 03/13/2018 1:01 PM | Anthony Heard (Administrator)

    The Evaluators' Institute (#TEI) welcomes CEA members to participate in its April/May 2018 Chicago Program. Contact tei@cgu.edu to access the special CEA discount!

    (April 30 – May 5, 2018)

    The Evaluators’ Institute (TEI) offers top-tier training in evaluation for beginning, mid-career, and advanced evaluation professionals. Taught by leaders in the field, TEI courses offer technical rigor in evaluation methods and practice through participant-centered, adult-learning approaches.

    For participants who are interested in charting a full course of study, TEI offers a number of professional certifications.

    You can find course descriptions and information on how to register on the TEI website here: https://tei.cgu.edu/

    Don’t miss this great opportunity for CEA members to build professional skills alongside great faculty and fellow participants. Act now before the classes fill up!

    The TEI team is standing by at tei@cgu.edu to answer any questions.
