Digital image exploration at Maui Community College
Katie M. Morzinski, Christopher J. Crockett, Ian J. Crossfield
Center for Adaptive Optics, University of California at Santa Cruz, 1156 High St., Santa Cruz, CA 95064
Astronomy Dept., University of California at Santa Cruz, 1156 High St., Santa Cruz, CA 95064
Astronomy Dept., University of California at Los Angeles, 430 Portola Plaza, Box 951547, Los Angeles, CA 90095
Lowell Observatory, 1400 W. Mars Hill Rd., Flagstaff, AZ 86001

Abstract.
We designed a two-day laboratory exploration of fundamental concepts in digital images for an introductory engineering course at Maui Community College. Our objective was for the students to understand spatial vs. brightness resolution, standard file formats, image tradeoffs, and the engineering design cycle. We used open investigation, question generation, and an engineering design challenge to help our students achieve these learning goals. We also experimented with incorporating Hawaiian language and cultural awareness into our activity. We present our method, student response, and reflections on the success of our design. The 2008 re-design of this activity focused on better incorporating authentic engineering process skills, and on using a rubric for summative assessment of the students' poster presentations. A single file containing all documents and presentations used in this lesson is available online.
1. Introduction
In inquiry-style laboratory activities, students learn science by performing science (Dow et al. 2000). Keys to inquiry are students' ownership of their learning and the authenticity of the activity to real-life science and engineering practices (Ash & Kluger-Bell 1999). Here we discuss an engineering inquiry on Digital Image Files that we developed under the auspices of the Professional Development Program (PDP). The PDP is a unique educational program that trains science, technology, engineering, and math (STEM) graduate students to teach science and engineering while simultaneously promoting STEM education at the undergraduate level and for historically underrepresented populations (Hunter et al. 2008). The PDP originated as part of the education theme of the National Science Foundation Center for Adaptive Optics (CfAO), and has now transformed to become a major component of the Institute for Scientist and Engineer Educators (ISEE; Hunter et al., this volume).
2. Activity Description

2.1. Venue Background
ISEE is a key player in the Akamai Workforce Initiative (AWI), a consortium that is developing education and employment opportunities for residents of the Hawaiian islands. The Hawaiian word akamai translates to clever, and the goals of AWI are to develop effective teaching in post-secondary schools in Hawai‘i, train local students for Maui-based careers in the technology industry, increase the representation of women and Native Hawaiians in Hawai‘i-based employment, and build partnerships between high-tech educators and employers on Maui.

AWI encompasses internships, community programs, electro-optics certification, and curriculum development. Curricula developed by the Teaching and Curriculum Collaborative (TeCC) have provided support for creating a Bachelor's degree in Applied Science in Engineering Technology at Maui Community College (MCC), allowing the school to seek accreditation as a four-year college: the University of Hawai‘i, Maui College. Toward this goal, in Fall 2008, three TeCC teams were invited to design curricula for a new course, Electronics 102: Instrumentation, taught by MCC professor Mark Hoffman. The TeCC teams designed three inquiries covering aspects of instrumentation: CCDs (Mostafanezhad et al., this volume), Spectroscopy, and Digital Image Files. This paper describes the Digital Images inquiry.

To plan this activity we first decided what we wanted the students to get out of the experience. We had four types of goals for the students: content, process, attitudinal, and CfAO programmatic goals. Our goals are summarized briefly in Table 1.
Table 1. The learner goals we set out as we began the activity-planning process.

Content Goals:
- Pictures can be represented by numbers
- Pixels and arrays
- Continuous vs. discrete
- Number of pixels and spatial resolution
- Bit depth and color resolution
- Relation between file size and resolution
- Image file manipulation
- Image file formats and header information

Process Goals:
- Defining a problem
- Proposing a solution
- Communicating in writing
- Evaluating tradeoffs
- Solving a problem with constraints
- Carrying out engineering process

Attitudinal Goals:
- Solving a problem in a team
- Being creative
- Making predictions
- Comfort in solving an engineering problem

CfAO Program Goals:
- Drawing on prior knowledge
- Observing and communicating
- Gaining career preparation
We taught this activity at Maui Community College in Professor Mark Hoffman's Electronic Instrumentation course to approximately 25 first- and second-year students majoring in Electrical Engineering Technology. The bulk of the hands-on investigation encompassed image encoding and then decoding. Students were provided with astronomical images on paper, a light box, and a photometer. Using these materials, students encoded their images into numbers. Students then wrote up their encoded images into image files, swapped them with other teams, and decoded another team's image by drawing the image (with chalks) from looking at the image file. Finally, students moved to the computer lab to experience more in-depth digital image manipulation. Table 2 shows the activity timeline; we discuss the activity components in more detail below.

Table 2. The top-level schedule that we used in our digital images inquiry.
Day 1:
- Intro to Culture of Communication (10 min.)
- Intro to Inquiry (5 min.)
- Intro to Digital Images (10 min.)
- Starters (40 min.)
- Break; facilitators sort questions (15 min.)
- Starters Mini-Synthesis (15 min.)
- Focused Investigation: Image Encoding/Digitization (60 min.)
- Homework Assigned (10 min.)

Day 2:
- Intro: Day 2 (10 min.)
- Focused Investigation: Image Decoding (30 min.)
- Prep for Sharing (15 min.)
- Sharing ("Jigsaw") (30 min.)
- Move to computer lab (15 min.)
- Image Manipulation (30 min.)
- Discussion: Communication experienced (10 min.)
- Synthesis & Closing (25 min.)

Total Time: 5 hrs.
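The encode-transmit-decode cycle at the heart of the investigation can be sketched in software. The following Python sketch is illustrative only (the students did this by hand with photometers and chalk), and the "P-TEXT" format here is invented for the example:

```python
# Sketch of the students' encode/decode cycle: turn a grayscale
# picture into a plain-text "image file", then reconstruct it.
# (Illustrative only; "P-TEXT" is a made-up format for this example.)

def encode(pixels, levels=4):
    """Encode a 2D list of brightness values (0-255) as text,
    quantized down to `levels` gray levels."""
    height, width = len(pixels), len(pixels[0])
    header = f"P-TEXT {width} {height} {levels}"
    step = 256 // levels
    rows = [" ".join(str(min(v // step, levels - 1)) for v in row)
            for row in pixels]
    return "\n".join([header] + rows)

def decode(text):
    """Reconstruct the quantized pixel array from its text encoding."""
    lines = text.splitlines()
    return [[int(v) for v in line.split()] for line in lines[1:]]

# A tiny 4x2 "image" ramping from black to white and back:
image = [[0, 64, 128, 255],
         [255, 128, 64, 0]]
encoded = encode(image)
decoded = decode(encoded)
```

Like the students' hand-written files, the text form carries a small header (dimensions and gray levels) followed by the brightness numbers themselves.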
A "Starter" is a brief, interactive pedagogical tool designed to stimulate student interest and engagement in a topic, and to present material relevant to subsequent components of an activity. We used four Starters, rotating all the students through each one in parallel. Each Starter was designed to introduce the students to a particular concept relevant to our lesson goals. We named our Starters "Photometer Playground," "Flag Reproduction," "Pixels and Grayscale," and "File Formats." An instructing facilitator was assigned to each Starter station. After each Starter station, students wrote down their questions, comments, and observations about that station; these writings were collected by the activity facilitators for later discussion.

At the "Photometer Playground" we introduced students to the use of a photometer for measuring the intensity of incident light; this tool was an essential component for the Focused Investigation that followed. Students explored the use of a photometer to understand how brightness can translate into a number. They first observed 40-Watt, 100-Watt, and 300-Watt bulbs, and then explored the effects of distance from, and projection angle relative to, the light source on the photometer reading (see Figure 1, left). Finally, students observed the photometer measurement when attenuating the light through paper printed with large two-inch squares of white, gradations of gray, and black ink. This was to demonstrate that a grayscale image could be captured by shining a light through it and measuring the brightness with a photometer.

Figure 1. Starters: Using photometers to measure brightness in "Photometer Playground" (left) and transmitting an image verbally in "Flag Reproduction" (right).

The purpose of "Flag Reproduction" was to encourage students to think about how picture information can be communicated. Students were paired off. One member of each pair had a printed picture of an international flag (chosen for a recognized format and simple geometric shapes). The other member of the pair had a blank sheet of paper and colored markers. Hiding the blank page and the flag from each other, the first student described (verbally) the flag such that the second student could draw it (see Figure 1, right). After doing their best to reproduce the flag, students viewed the result and reflected on the process.
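The trends students observed at the "Photometer Playground" can be modeled, under idealizing assumptions, with the inverse-square law. A hedged Python sketch (not part of the lesson materials) that treats the bulb as an isotropic point source and ignores ambient light:

```python
# Idealized model of the "Photometer Playground" observations:
# the reading falls off as 1/r^2 with distance and as cos(theta)
# with projection angle. (A simplification; real photometers also
# see ambient light, and bulbs are not perfectly isotropic.)
import math

def photometer_reading(watts, distance_m, angle_deg=0.0):
    """Irradiance (W/m^2) at the sensor from a bare bulb,
    treated as an isotropic point source."""
    irradiance = watts / (4 * math.pi * distance_m ** 2)
    return irradiance * math.cos(math.radians(angle_deg))

# Doubling the distance quarters the reading, matching the
# students' observation that the measurement shrinks with distance:
near = photometer_reading(100, 1.0)
far = photometer_reading(100, 2.0)
```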
Figure 2. Comparing images at different resolution in "Pixels and Grayscale."

The next Starter, "Pixels and Grayscale," introduced students to the ideas of pixel scale and bit depth (grayscale). A grayscale photograph of the moon was reproduced with ten varying pixel scales and ten varying bit depths (see Figure 2). The twenty images were arranged face-up on a table, and students examined them and wrote questions or observations. Students were prompted to think of the differences between the images, and the advantages and disadvantages of each way of representing the moon.

We designed the fourth Starter, "File Formats," to start students thinking about how images are recorded in digital formats. Students were presented with one simple image (a black and white pixellated "happy face") encoded in a variety of formats (.eps, .fits, .jpg, .pgm, .png, .svg). The ASCII or hex data in each file was printed out on the back of each image page, and students were prompted to compare the pictures (which all looked the same) and the ASCII/hex file formats, including both the header and body of the file formats (see Figure 3). Prompts asked students to think about the differences and the advantages and disadvantages of each.

Figure 3. Image (top left) and file data for the .pgm (lower left) and .eps (right) formats of the image. This was to illustrate representation of the same image in many different file formats in the "File Formats" Starter.

Table 3. An edited sampling of the questions and observations generated by students during the Starters, sorted into categories corresponding to learning goals.

Transmitting images:
- How do you communicate scale within a flag?
- Less data = easier to transmit/process

Measuring light levels with a photometer:
- I notice the measurement gets smaller when the photometer is farther away
- The darker the sheet of paper through which the light goes, the lower the reading
- How much would turning off the room lights change the readings?

Evaluating tradeoffs:
- Is there a sweet spot between good enough quality and too big of a file size?
- How many megapixels are needed for a sharp and clear image?
- Is there an advantage in using a short picture format vs. a long one?
- Some file formats are easier to be read by a human. Are these not as useful?

Information content:
- I believe that each pixel has its own number that represents its number in grayscale
- The image quality is not clear with limited pixels
- The more pixels there are, the overall quality of the images gets better and better

Image file formats:
- Why are there so many different file types?
- Each one is formatted differently but all of them appear to be the same image
- The compressed formats JPEG, PNG are unreadable
- Which format will produce the best quality image?

Table 3 lists a sample of the questions and observations generated by students during the Starters. The questions generated by students in our Starters were not used directly for the Focused Investigations. Rather, the questions were used to engage students' curiosity and introduce them to some of the concepts they would be exploring later. Students did not choose questions to investigate; instead, the investigations were built around a particular engineering challenge with images and digital image files. Therefore, after the break, we did a mini-"synthesis" of the Starters (Figure 4) by going over the generated questions with the students to ensure the knowledge gained in the Starters became a shared classroom experience.

Figure 4. Mini-synthesis of the students' observations from the Starters.
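The two parameters compared in the "Pixels and Grayscale" Starter, pixel scale and bit depth, can be demonstrated in a few lines of NumPy. This is an illustrative sketch, not part of the lesson; the `downsample` and `quantize` helpers are names we introduce here:

```python
# The two knobs explored in "Pixels and Grayscale": spatial
# resolution (number of pixels) and bit depth (number of gray
# levels). Illustrative sketch only.
import numpy as np

def downsample(img, factor):
    """Reduce spatial resolution by block-averaging factor x factor tiles."""
    h, w = img.shape
    return img[:h - h % factor, :w - w % factor] \
        .reshape(h // factor, factor, w // factor, factor) \
        .mean(axis=(1, 3))

def quantize(img, bits):
    """Reduce bit depth: keep only 2**bits gray levels."""
    levels = 2 ** bits
    step = 256 / levels
    return (np.floor(img / step) * step).astype(np.uint8)

# A stand-in "moon photograph" of random brightness values:
moon = np.random.default_rng(0).integers(0, 256, (64, 64))
coarse = downsample(moon, 8)  # 64x64 pixels -> 8x8 pixels
flat = quantize(moon, 1)      # 256 gray levels -> 2 gray levels
```

Laying out a grid of such images for several `factor` and `bits` values reproduces the tradeoff the students examined on the table.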
Figure 5. Light box: An open box supports a sheet of plexiglass. Inside the box, a light bulb illuminates the image placed on top of the plexiglass. Measurements of the image brightness across the picture are made with a photometer.

For the Focused Investigation, students were given a grayscale astronomy-related photograph. Each team was also given an engineering challenge in the form of various "science cases" to stimulate different approaches. The goals focused on optimizing either spatial or color (grayscale) resolution, as explained in Table 4. Furthermore, teams were given a limited budget and a formula for the "transmission cost" per pixel and color bit. Their budget was $1000; pixels were $2 each, while colors were $50 each. This ensured that teams could not maintain the fidelity of the image in terms of both spatial and color resolution, but rather had to make a tradeoff. In anticipation of this, the "Pixels and Grayscale" Starter got students to think about information content and the number of pixels or color bits in an image. Of course, students were also limited by the amount of time they had to use the photometers; not all groups considered this during their planning! Students were told to record the image using letters and numbers only, so that they could transmit the image to another team who would then re-create the image with the goal as given, for example mapping sunspots. They used the photometers to do so, digitizing their images by hand using the light boxes (Figure 5) as practiced in the "Photometer Playground" Starter.

Table 4. Goals for encoding each image during the Focused Investigation. Each team had one image and one goal, focused on either spatial or color resolution.

Image   | Spatial Resolution Goal                          | Color Resolution Goal
Sun     | Differential rotation rate; map sunspots in time | Temperature of sunspots; brightness of sunspots
Moon    | Elevation topography; map maria & terrae         | Temperature of rocks; brightness of rocks
Jupiter | Rotation period; map clouds in time              | Height of clouds; brightness of clouds
Saturn  | Ring structure; map rings                        | Chemical composition; brightness of atmosphere
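The budget arithmetic above can be made concrete. Assuming the cost is $2 per pixel plus $50 per gray level, as described, a team must trade pixel count against gray levels; this small sketch (ours, not a lesson handout) checks candidate designs against the budget:

```python
# The Focused Investigation's constraint: $1000 total, $2 per pixel,
# $50 per gray level ("color"), so spatial and brightness resolution
# cannot both be maximized. (Assumption: cost is per gray level.)

BUDGET = 1000
PIXEL_COST = 2   # dollars per pixel
COLOR_COST = 50  # dollars per gray level

def transmission_cost(n_pixels, n_colors):
    """Total cost to transmit an image of n_pixels with n_colors gray levels."""
    return n_pixels * PIXEL_COST + n_colors * COLOR_COST

# Two designs that both exactly exhaust the budget:
sharp = transmission_cost(n_pixels=400, n_colors=4)    # 20x20 pixels, 4 grays
smooth = transmission_cost(n_pixels=100, n_colors=16)  # 10x10 pixels, 16 grays
```

The two example designs show the tradeoff directly: a sharper grid forces fewer gray levels, and vice versa.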
For homework after the first 2.5-hour course session, students had to write up their digitized image along with a file format description (to provide directions on how to decode their image data). On Day 2, students swapped images and used the written information to re-draw the image using grayscale chalks. This exercise was anticipated in the "Flag Reproduction" Starter, when students practiced communicating an image, and in the "File Formats" Starter, when students were exposed to different file formats for describing the same image.

After each team had reproduced another team's image file, the decoded drawings were handed back to the original team for sharing. Facilitators photocopied the drawings so that each student would have a copy. We used a "Jigsaw"-style sharing in which each team split up and sent one team member to each facilitator to share their results with one-third of the class. This ensured that each student was responsible for all the material. Students made posters stating their science goal from Table 4, describing their tradeoffs in encoding or digitizing their image, displaying the resulting drawing, and reflecting on the investigation. Figure 6 shows two students' posters.

Figure 6. Posters by students for sharing.

Students presented individually to one of the three facilitators, and facilitators scored their presentations with a rubric (Table 5) as a tool to conduct a summative assessment of the students' learning. Our design team was one of the first PDP design teams to pilot the use of a rubric for inquiry. We chose three categories on which to grade each presentation: describing the encoding process, describing the image file, and practicing good communication skills. We expected students' level of mastery to advance in proficiency from left to right along a row in the rubric. However, to allow for a student achieving mastery at the last column of a given row yet missing one of the more basic items in another row, we awarded students 1 point per cell.
After sharing what students had learned in digitizing and transmitting images by hand, we moved to the computer lab to do an exercise with images in the .pgm format. Students manipulated the numbers in a simple .pgm image of the moon, and then viewed the results with the image-display program IrfanView. We provided students with prompts such as making the image darker or inverting the colors. This exercise reinforced the idea that digital images are represented by numbers in arrays and that the values in the image body represent the brightness of each pixel.
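Because a plain .pgm file is just a text header followed by brightness numbers, the lab's manipulations (darkening, inverting) reduce to arithmetic on those numbers. A minimal sketch, assuming the ASCII "P2" variant of the format; the helper names are ours, not from the lesson materials:

```python
# A plain (ASCII "P2") PGM image is a header plus an array of
# brightness numbers, so "image processing" is arithmetic on
# those numbers -- the point of the computer-lab exercise.

def read_pgm(text):
    """Parse a plain P2 PGM string into (maxval, 2D pixel list)."""
    tokens = [t for line in text.splitlines()
              for t in line.split("#")[0].split()]  # strip comments
    assert tokens[0] == "P2", "only the ASCII P2 variant is handled"
    width, height, maxval = map(int, tokens[1:4])
    values = list(map(int, tokens[4:4 + width * height]))
    pixels = [values[r * width:(r + 1) * width] for r in range(height)]
    return maxval, pixels

def invert(maxval, pixels):
    """Invert brightness, as in one of the lab prompts."""
    return [[maxval - v for v in row] for row in pixels]

def darken(pixels, factor=0.5):
    """Darken by scaling every brightness value down."""
    return [[int(v * factor) for v in row] for row in pixels]

# A 3x2 toy "moon" image with maxval 255:
moon = "P2\n3 2\n255\n0 128 255\n255 128 0\n"
maxval, px = read_pgm(moon)
```

Writing the modified numbers back out in the same header-plus-body layout and opening the file in a viewer shows the effect, just as the students did with IrfanView.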
Finally, we wrapped up the lab with a reflection on the different ways communication expert Kalei Tsuha of MCC had observed students communicating throughout the activity, followed by a synthesis lecture on what students had learned. For homework, students were asked to produce a report justifying their decisions in light of their constraints and science goals. In this report, the students were expected to discuss the possible design tradeoffs, the limitations of their design, and how they might redesign their solution in the future.

Table 5. Rubric used for summative assessment.
Task: Describe the encoding process.
- Did not meet expectations [+1]: Student shows their original image, the drawing another team made of it, and explains the scientific goal they were working toward.
- Met expectations [+1]: Student describes their image encoding method (photometer digitization, vector graphics, or other).
- Exceeded expectations [+1]: Student explains the tradeoffs they evaluated and gives reasons for choosing their image encoding method.

Task: Describe team's image file format, giving reasons.
- Did not meet expectations [+1]: Student shows their image file format, identifying the header and body.
- Met expectations [+1]: Student explains what the header and body mean, and why the particular image file format was chosen to meet the scientific goals.
- Exceeded expectations [+1]: Student evaluates the clarity of their image file format by the fidelity of the drawn image, and suggests changes they could have made to clarify their image encoding.

Task: Show communication skills.
- Did not meet expectations [+1]: Student speaks and has visual aids.
- Met expectations [+1]: Student speaks clearly and audibly, and has visual aids that are legible and appropriate.
- Exceeded expectations [+1]: Student engages in relevant discussion with classmates about presentation.
3. Discussion
We asked the students to fill out written feedback forms to improve our instruction in the future. Some concepts students wanted to explore further included more practice encoding or digitizing images, more on image formatting and compression, and more on image manipulation. Students rated each component of the activity on a five-point scale; results are shown in Table 6. Students got the most out of the image decoding, poster sharing, and synthesis lecture.
Table 6. Student feedback on a five-point scale.
Activity Component Mean Score Std. Dev.
Starter                   3.9   1.2
Image Encoding            3.9   1.3
Homework: File Creation   3.6   1.4
Image Decoding            4.4   0.8
Poster Session            4.5   0.8
Computer Activities       3.9   1.3
Synthesis Lecture         4.5   0.6

This activity in Fall 2008 was a redesign of a similar Digital Image Files inquiry taught in Spring 2008. In the redesign we attempted to add more authenticity to the engineering challenge, both by tying it to a science goal (e.g., a goal of mapping sunspots to motivate a focus on optimizing spatial resolution) and by adding the monetary budget constraint. We got feedback from the course instructor Elisabeth Reader, in reviewing the write-up assigned on Day 2, that many students still found identifying tradeoffs to be difficult. Upon reflection, the budgetary constraint may have been too complicated, and in the future we would like to spend more time clarifying the science goals in Table 4 so that students can better make tradeoffs to optimize achievement of the goal in a more authentic way.

On the whole, the inquiry was a success, as students learned about digital images, pixels, transmitting and communicating images, and making tradeoffs.

As inquiry designers and facilitators, we feel we accomplished our goals in this activity. After the effort involved in designing and teaching, we would be pleased to see our work go farther, and have thus made all the materials and a lesson plan available on the website of facilitator IJC. MCC instructor Elisabeth Reader has already taught the activity again with a new class, also finding it successful.

Acknowledgments.
Lisa Hunter was an observer and design-team consultant at MCC, while Lynne Raschke was a consultant at the PDP workshop. UH-Maui professors Mark Hoffman and Elisabeth Reader (the classroom teachers hosting this inquiry) and John Pye provided advice and assistance. Hawaiian language and culture expert Kalei Tsuha consulted and contributed to the culture and communication portion. J. D. Armstrong and Joe Masiero were PDP participants who designed the initial Digital Images inquiry at the 2007 PDP and taught the inquiry as a pilot in Spring 2008.

This material is based upon work supported by the National Science Foundation (NSF) Science and Technology Center program through the Center for Adaptive Optics, managed by the University of California at Santa Cruz (UCSC) under cooperative agreement AST, and by the Office of Scientific Research (via NSF AST).

References
Ash, D., & Kluger-Bell, B. 1999, in Inquiry: Thoughts, Views, and Strategies for the K-5 Classroom (Foundations Series, National Science Foundation), 79

Dow, P., Duschl, R. A., Dyasi, H. M., Kuerbis, P. J., Lowery, L., McDermott, L. C., Rankin, L., & Zoback, M. L. 2000, Inquiry and the National Science Education Standards: A Guide for Teaching and Learning (Washington, D.C.: National Academies Press)

Hunter, L., Metevier, A., Seagroves, S., Porter, J., Raschke, L., Kluger-Bell, B., Brown, C., Jonsson, P., & Ash, D. 2008, Cultivating Scientist- and Engineer-Educators: The CfAO Professional Development Program. URL http://isee.ucsc.edu/participants/programs/CfAO_Prof_Dev_Program.pdf