Chapter 4 Notes

Chapter 4

Particularly Appropriate Qualitative Applications

Here are my notes on Chapter 4, as promised in an attachment. I am still typing notes and will provide updated notes by 9/22/11.

The chapter on particularly appropriate qualitative applications discusses qualitative methods of inquiry into what people do, what they think, and what they know. This inquiry is done through interviews, observation, and document review. Qualitative methods are not appropriate for every inquiry; the aim of this chapter is to illustrate when it may be appropriate to use particular qualitative methods. Chapter 4 reviews how qualitative methods can contribute to evaluation, problem solving, real-world decision making, action research, policy analysis, and organizational or community development. It offers examples of research and evaluation questions aimed at helping researchers decide when interviewing people, doing fieldwork, and constructing case studies are useful in practical applications. In doing so, one can discover the value of in-depth, open-ended inquiry into people's perspectives and experiences. Appropriate qualitative applications emerge from the power of observation, openness to learning from the world, and inductive analysis to make sense of the world's lessons. Practical applications rest on a few basic, simple ideas: pay close attention, listen and watch, be open, think about what you hear and see, document systematically, and apply what you learn. Studies that focus on quality and its applications are as follows:


* **Understanding and illuminating quality**
* **Quality assurance**: systematic monitoring and evaluation of various aspects of the production process


* **Evaluation applications**
* **Outcome evaluations**: provide information on how well a program is accomplishing its goals


* **Evaluating individualized outcomes**: matching program services and treatments to the needs of individual clients (highly individualized programs operate under the assumption that outcomes will differ from client to client); for example, the text refers to a model of the education process that assumes the outcomes of education are unique for each child


* **Process studies**: aimed at clarifying and understanding the internal dynamics of how a program, organization, or relationship operates. A process focus involves looking at how something happens rather than, or in addition to, examining outputs and outcomes; the emphasis is on how we do what we do, not only on what we accomplish. A process evaluation can ask: 1) What things do people experience that make this program what it is? 2) How are clients brought into the program? 3) How do clients move through the program once they become participants? 4) How does what people do relate to what they're trying to accomplish? 5) What are the strengths and weaknesses of the program from the perspectives of participants and staff?


* **Implementation evaluation**: tells decision makers what is going on in the program, how the program has developed, and how and why the program deviates from initial plans and expectations. A decision maker typically uses it to make sure a policy is being put into operation according to its design, or to test the feasibility of a policy. For example, implementation evaluation answers questions such as: 1) What do clients in the program experience? 2) What services are provided to clients? 3) What does the staff do? 4) What is it like to be in the program? 5) How is the program organized?


* **Logic models and theories of action**: depict the connections between program inputs, activities and processes, outputs, immediate outcomes, and long-term impacts. Logic models portray a reasonable, defensible, and sequential order from inputs through activities to outputs, outcomes, and impacts. Theories of action are explanatory and predictive.


* **Evaluability assessments**: conducted through interviews, document analysis, and observation to determine whether a program is ready for a formal and rigorous evaluation. This process involves making sure program treatment models are clearly identifiable and logical; that outcomes are clear, specific, and evaluable; and that implementation strategies are reasonably and logically related to expected outcomes.


* **Comparing programs: focus on diversity**: programs differ from place to place because places are different. Individualizing services to clients has been one of the major thrusts of social action and educational programs (behavioral, humanistic, and developmental models); see 4.3 in the text.


* **Prevention evaluation**: the usual designs for evaluating prevention programs use experimental control groups.


* **Documenting development over time and investigating system changes**: process-oriented approaches to facilitating change.


* **Evaluation models**
* **Goal-free evaluation**: stands in contrast to goals-based evaluation, which measures the extent to which a program or intervention has attained clear and specific objectives, gives evaluators structure and support, shapes methodological decisions, offers guidance about appropriate steps to follow in design, and provides direction when dealing with stakeholders. Goal-free evaluation instead means doing fieldwork and gathering data on a broad array of actual effects or outcomes, then comparing the data with the actual needs of program participants. There are four primary reasons for doing goal-free evaluation: 1) avoid the risk of narrowing the study to stated goals and missing important unanticipated outcomes; 2) remove the negative connotations attached to the discovery of unanticipated effects; 3) eliminate the perceptual biases that knowing the goals introduces into an evaluation; 4) maintain the evaluator's independence.


* **Transaction models**: responsive and illuminative evaluation, emphasizing context and interpretation, along with: 1) identifying issues and concerns based on direct, face-to-face contact with people in and around the program; 2) using program documents to further identify important issues; 3) direct, personal observation of the program before formally designing the evaluation, to increase the evaluator's understanding; 4) designing the evaluation based on the issues that emerged in the preceding three steps; 5) reporting information; 6) matching information reports and report formats to their audiences.


* **Connoisseurship studies**: place the evaluator's perceptions and expertise at the center of the evaluation process.


* **Utilization-focused evaluation**: offers an evaluation process, strategy, and framework for making decisions about the content, focus, and methods of an evaluation. Utilization-focused evaluation begins with the identification and organization of specific, relevant decision makers and information users.


* **Interactive and participatory applications**: practical and pragmatic forms of inquiry in which the researcher is especially sensitive to the perspectives of others and interacts closely with them in designing and/or implementing the study. These applications, congruent with qualitative methods, include efforts to personalize and humanize research and evaluation; working with stakeholders to harmonize program and evaluation values; action learning and reflective practice; appreciative inquiry; and facilitating collaboration with coresearchers.


* **Personalizing and humanizing evaluation**: particularly important for education, therapy, and development efforts that are based on humanistic values. This application puts people first in programs grounded in humanistic concerns and principles, especially by advocating that the perspectives of the participants be given primacy.


* **Harmonizing program and evaluation values**: the final design of an evaluation depends on calculated trade-offs and weighed options, including political, philosophical, and value considerations, assessing each option's strengths and weaknesses in relation to both values and technical factors.


* **Developmental applications: action research, action learning, reflective practice, and learning organizations**: these problem-solving and learning-oriented processes often use qualitative inquiry and case study approaches to help a group of people reflect on ways of improving what they are doing or understand it in new ways.


* **Appreciative inquiry**: a popular organizational development approach that emphasizes building on an organization's assets rather than focusing on problems or even problem solving. It is a theory, a mindset, and an approach to analysis that leads to organizational learning and creativity.


* **Participatory research and evaluation: valuing and facilitating collaboration**: participatory action research encourages joint collaboration within a mutually acceptable ethical framework to understand and/or solve organizational or community problems. Principal researchers train the coresearchers to observe, interview, reflect, and/or keep careful records or diaries. Collaborative inquiry processes can have an impact beyond the findings generated by a particular study.


* **Supporting democratic dialogue and deliberation**: because qualitative methods are especially accessible to, and understandable by, nonresearchers, case studies can be an excellent resource for supporting inclusion and dialogue. Democratic evaluators recognize and support value pluralism, with the consequence that the evaluator should seek to represent the full range of interests in the course of designing an evaluation. A parallel and reinforcing use of evaluation focuses on helping people learn to think and reason evaluatively, and on how rendering such help can contribute to strengthening democracy over the long term.


* **Supporting democracy through process use: helping the citizenry weigh evidence and think evaluatively**: democratic evaluation reframes the policy analyst's function from an emphasis on generating expert judgments to an emphasis on supporting informed dialogue, including methodological dialogue. Helping people learn to think evaluatively is a form of process use: individual changes in thinking and behaving occur among those involved as they learn the standard evaluation process. This supports democracy through 1) a more informed electorate via the use of findings, and 2) a more thoughtful, deliberative citizenry through helping people learn to think and engage each other evaluatively. For example, the values of evaluation include clarity, specificity, and focus; being systematic and making assumptions explicit; operationalizing program concepts, ideas, and goals; and separating statements of fact from interpretations and judgments.

Special applications
* **Unobtrusive measures**: measures that do not require the researcher to intrude in the research context.


* **State-of-the-art considerations: lack of proven qualitative instrumentation**
* **Confirmatory and elucidating research: adding depth, detail, and meaning to qualitative analyses**


* **Rapid reconnaissance**


* **Capturing and communicating stories**


* **Legislative monitoring and auditing**


* **Futuring applications: anticipatory research and prospective policy analysis**


* **Breaking the routine: generating new insights**