Non-rectilinear projection design for live cue-able theatrical performance


Material Information

Title:
Non-rectilinear projection design for live cue-able theatrical performance
Physical Description:
Book
Creator:
Powell, Brittany
Publisher:
College of Fine Arts, University of Florida
Place of Publication:
Gainesville, Fla.
Publication Date:

Notes

Abstract:
The goal of this project in lieu of thesis is the research and development of a digital projection mapping tool that is specifically designed for live theatrical performances and that is accessible, easily controllable, and cue-able. Working inside of the existing projection software, Isadora, the author constructed the Polygon Mapper, which gives the user complete control over mapping non-rectilinear objects while utilizing Isadora-specific functionality to manipulate and cue performances. This is unique to the theatrical field in that other mapping software lacks complete control, cue-ability, or accessibility. All three of these components must be present to achieve optimal functionality that can be exploited in any theatre, for any show, on any budget. This paper will review the history of projections in theatre, outline the research and development process for creating and manipulating the software, and provide a detailed manual for mapping using the software designed inside of Isadora. This Master's Thesis proposal has been accepted for presentation at the 2012 Southeastern Theatre Conference in March in Chattanooga, Tennessee under the presentation title: “Projection Design: Video Mapping onto Non-Traditional Surfaces.”
General Note:
Digital Arts and Sciences terminal project

Record Information

Source Institution:
University of Florida Institutional Repository
Holding Location:
University of Florida
Rights Management:
All rights reserved by the source institution and holding location.
System ID:
AA00009518:00001




Full Text

PAGE 1

! "! NON-RECTILINEAR PROJECTION DESIGN FOR LIVE CUE-ABLE THEATRICAL PERFORMANCE By BRITTANY POWELL SUPERVISORY COMMITTEE: ANGELOS BARMPOUTIS CHAIR TIZA GARLAND MEMBER PATRICK PAGANO MEMBER A PROJECT IN LIEU OF THESIS PRESENTED TO THE COLLEGE OF FINE ARTS OF THE UNIVERSITY OF FLORIDA IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF MASTER OF ARTS UNIVERSITY OF FLORIDA 2011

PAGE 2

© 2011 Brittany Powell

PAGE 3

To Donna for being selfish, to Lear for bringing it full circle, to Tim for being the balance

PAGE 4

ACKNOWLEDGEMENTS

I thank the chair and members of my supervisory committee for their guidance and passion, the Digital Worlds Institute and the School of Theatre and Dance for their generous support, the design team of The Last Unicorn for their long hours and brilliant ideas, Anton Yudin for his altruism, and my friends and family for their constant encouragement, support and patience.

PAGE 5

TABLE OF CONTENTS

Dedication Page 3
Acknowledgments Page 4
Abstract 8
Introduction 10

CHAPTER
1. History of Projections in American Theatre 11
1.1. Agit-prop and Documentary Theatre 11
1.2. Documentary Theatre Re-defined 12
1.3. Piscator's Formative Years 13
1.4. The New School 15
1.5 Tennessee Williams 16
1.6 The Artist Collectives 16
1.7 Bertolt Brecht 18
2 Research Process 19
2.1 Current State of the Industry 19
2.2 Software Overview 21
2.2.1 Software comparison chart 23
2.2.2 Isadora 24
2.2.3 MadMapper 25
2.2.4 Modul8 with free MapMapMap software 26
2.2.5 Resolume 26
2.2.6 Video Projection Tool (VPT) 27
2.2.7 Pure Data (PD) 27
2.2.8 MAX/MSP/Jitter 28
2.2.9 Quartz Composer 28
2.2.10 QLab 29
2.2.11 Watchout 29
2.2.12 Green Hippo 30
3 Development Process 30
3.1 Isadora with Proposed Mapping Software 31

PAGE 6

3.2 Characteristics for Comparison 31
3.2.1 Dynamic mapping 31
3.2.2 Map manipulation 34
3.2.3 Media manipulation 35
3.2.4 Concaved maps 35
3.2.5 Anti-aliasing 35
3.2.6 Adding effects 36
3.3 Polygon Mapper Design 36
3.3.1 Video input 37
3.3.2 Mouse X and mouse Y 37
3.3.3 Key code 39
3.3.4 Anti-aliasing application 40
3.3.4.1 Anti-aliasing defined 40
3.3.4.2 Supersampling 42
3.3.4.3 Jitter sampling 44
3.3.4.4 Jitter points source code 45
3.3.4.5 Anti-aliasing within the Polygon Mapper 46
3.3.5 F value 49
3.3.6 Edit mode 49
3.3.7 Dot size 50
3.3.8 Vertex count 51
3.3.8.1 Adding and subtracting vertices 53
3.3.9 Video output 55
3.3.10 Animated manipulation effects 56
3.3.11 Media manipulation 57
3.3.12 Mapping concaved polygons 58
3.3.13 Adding effects 61
3.4 Cueing with Isadora 61
3.4.1 MIDI control 63
3.5 Troubleshooting 63
3.5.1 Tracking edit dots 64

PAGE 7

3.5.2 Resolution issues 64
3.5.3 Polygon Mapper unlinking bug 67
3.5.4 Isadora issues with show control computer 68
3.6 Conclusion of Development Process 69
4 The Last Unicorn: A Performance Within a Project 69
4.1 Installations 70
4.2 Projectors, Sight Line, and Shadows 73
4.3 Projection Artwork 75
4.4 Installation Artwork 76
Conclusion 77
Footnotes 77

APPENDIX
A. Manual 80
B. Source Code 104
C. Installation Photo Gallery 127
F. The Last Unicorn Production Photo Gallery 137

REFERENCE LIST 143
BIOGRAPHICAL SKETCH 147

PAGE 8

Abstract of Project in Lieu of Thesis Presented to the College of Fine Arts of the University of Florida in Partial Fulfillment of the Requirements for the Degree of Master of Arts

NON-RECTILINEAR PROJECTION DESIGN FOR LIVE CUE-ABLE THEATRICAL PERFORMANCE

By Brittany Powell
December 2011
Chair: Angelos Barmpoutis
Major: Digital Arts and Sciences

The goal of this project in lieu of thesis is the research and development of a digital projection mapping tool that is specifically designed for live theatrical performances and that is accessible, easily controllable, and cue-able. Working inside of the existing projection software, Isadora, the author constructed the Polygon Mapper, which gives the user complete control over mapping non-rectilinear objects while utilizing Isadora-specific functionality to manipulate and cue performances. This is unique to the theatrical field in that other mapping software lacks complete control, cue-ability, or accessibility. All three of these components must be present to achieve optimal functionality that can be exploited in any theatre, for any show, on any budget. This paper will review the history of projections in theatre, outline the research and development process for creating and manipulating the software, and provide a detailed manual for mapping using the software designed inside of Isadora. This Master's Thesis proposal has been accepted for presentation at the 2012 Southeastern Theatre

PAGE 9

Conference in March in Chattanooga, Tennessee, under the presentation title: "Projection Design: Video Mapping onto Non-Traditional Surfaces."

PAGE 10

! "+! Introduction Projections came to be in theatre for two reasons: a didactic awareness and a spectacle performance. Brought to the United States by way of Documentary Theatre, Erwin Piscator pioneered new stage techniques with projections in an effort to force social awareness on the audience Tennessee Williams, working against the philosophy, felt the theatre would be better served to use projections as a visualization apparatus of the human condition. Both viewpoints took hold in the American theatre industry and spread throughout the country and the generations. Today, p rojections in theatre serve one main purpose: to comment on the actor and/ or action. Take for example, The Sound of Music, the Von Trapp family sings and dances in front of a projection of the mountain ranges of Austria. The projection is setting a location for the action and drawing the audience into t he world of the play. Another example, The Laramie Project, the true story of Matthew Shepard who was brutally beaten to death for being homosexual, during the trial of the accused murders, anti -gay protest images are displayed on the screen to show the harsh reality of hate in America. Each is an example of projections in use based on Piscators and Williams ideas. The sentiment is different, but the overreaching theme of maki ng a statement about what or who is on stage is analogous The rapid advance of technology offers even more projection possibilities than ever. Projection mapping is a projection technique that transforms almost any 2D or 3D surface into a video display. Projection mapping is most often used in architectural installation and in live music shows when projections are the focal point. This technique opens up an entirely new realm of theatre yet it has limited reach into the purely theatre industry as of yet.

PAGE 11

! ""! The proceeding chapters of this paper discuss the history, developm ent and practical application of a technology that can combine both theatrical necessities and designer freedom. Chapter 1 History of Projections in American Theatre The projection industry has come a long way since the time of the Chinese shadow puppet s of the Han Dynasty1 and the invention of the magic lantern in the 17th century.2 Its place in American theatre today was carved out by a series of even ts that shaped the artists and, by extension, the art of the time. 1.1 Agit -prop and Documentary Theatre The starting point for projections in theatre begins with Documentary Theatre. Documentary Theatre was precipitated by agit -prop (agitation propaganda) theatre defined as an attempt to bring citizens to the proper point of radical awareness so th at they can afterwards be moved to voluntary action. 3 Agit-prop theatre, contrary to a form of pure entertainment, was a means of educating and communicating news to the illiterate populace beginning in the U.S.S.R. after the Revolution of 1917 It is analogous to a town crier and is characterized by spontaneous, unrehearsed troupes performing in public venues with no stage, costumes, props, etc. attempting to arouse support for social issues of the time.4 Agit-prop theatre is sometimes referred to as Political Theatre; however, the former is characterized by impromptu performances, where the latter is a more structured endeavor. Georg Bchner, a young German playwright, wrote Dantons Death in 1835 about the French Revolution. 5 This play is considered the first building block in the foundation of Documentary Theatre, which would not receive lasting world -wide impact until the 1920s by Erwin Piscator, who would later stage a production of Dantons Death in 1956 using projections.6

PAGE 12

! "#! Documentary Theatre can be defined simply in an interview with Emily Mann7, a well-known Documentary Theatre director and playwright, by author Gary Fisher Dawson. She explains, I usually ask [people] if they if they have seen any documentary films. Almost everyone has. I say, well thats what I do. I go out and find the event. I go to the place. I do a lot of work on it. I do a lot of research on it. I interview a whole lot of people. I find documents that have to do with that. Then I construct a play out of that. Im working from life and its very personal.8 1.2 Documentary Theatre Re-defined The advent of projections in theatre in the 1920s was introduced in a new form of Documentary Theatre referred to as Epic Theatre. Director and teacher, Erwin Piscator is considered the foremost practitioner of this genre of theatre. In Dawsons book, Documentary Theatre in the United States: An Historical Survey and Analysis of its Content, Form, and Stagecraft, the author illustrates the characteristics of Epic Theatre. Epic Theatre9 Through-line of action Focus is on the historical documented background of the event using montage and juxtaposition in place of exposition. Character Two -dimensional Factually based within a social context. Societal dynamic Past comments on the present. Assumes critical stance towards society by distancing the audience. Desires to transform society. Mise-enscne Total Theatre Acting Style Presentational Objective-acting

PAGE 13

! "$! Piscator explained that Epic Theatre was about the extension of the action and the clarification of the background to the action, that is to say it involved a continuation of the play beyond the dramatic framework.10 Epic Theatre was about the disillusionment of the audience, which forced them to confront the harsh reality of the social climate of the time. No longer was theatre to be an escape hinging on the audiences willing suspension of disbelief. A didactic play was developed from a spectacle-play.11 As social issues of the time made their way onto the stage, theatre techniques had to evolve to keep up. Instead of simply talking about the post -Nazi era on stage, photographs and newspaper clippings could be projected onto the stage, sound clips and loud speakers co uld bombard the audience, advanced staging machinery could link ed the inside of the theatre with the booming Machinery Age going on outside. All of this was done to give the audience a more realistic sense of time, place, and call to action. This is the premise of Epic Theatre and the idea of a Total Theatre, characterized by the use of all technical aspects in the theatre: projections, light, sound, machinery, staging, costumes, props, etc. Piscator brought all of this to the American stage. 1.3 Piscators Formative Years In 1915, Piscator, a German national, was conscripted into the First World War where he served 2 years at the front lines .12 During the war, he was commanded to work with the army acting groups which entertained the troops with pop ular comedy, crudely done.13 After the war, which Piscator blamed on capitalism, he joined the Communist party.14 His exposure to the violence of the war and the asinine theatre meant to block out the reality of the soilders situation shaped his repudiation of art for the sake of entertainment and he rejected all art which had no relevance to the real conditions of life.15 However, after returning to Berlin in early 1919 after the end of the war, he was introduced to the Dada movement by a friend. Dada was essentially an international artistic phenomenon, which sought to overturn the traditional bourgeois notion of art.16 Established in Zurich in 1916, Dada

PAGE 14

! "%! developed from a combination of factors that altered the worlds mindset the First World War, the French Revolution, technological advances in the Machine Age, Freud.17 Dada art had no rules or even guidelines. Art became a reflection of your inner-self and was created to provoke and offend traditionalists.18 One of the most notable pieces created during the Dada movement, or non-movement as it was sometimes called, was created by Marcel Duchamp painting a mustache and writing obscenities on a copy of the Mona Lisa.19 The non-movement reached Germany in 1918 and due to the post -war, politically charged social and economical climate, Berlin Dadists tended to be highly politicized.20 The Dada movement ended in the early 1920s and gave rise to Surrealism in 1924, which boasts artist s such as Salvador Dal, Joan Mir, and Ren Magritte. Well know Dadaists include Tristan Tzara and Hannah Hoch. Piscators involvement with the Dada movement was brief and ultimately reconfirmed his earlier principles that art must be based on current events and present a call to action to the audience. Dada did not produce the audience response Piscator sought. Despite his abandonment of the overall purpose of the movement, Piscator s works were heavily influenced by the theatrical techniques employed by artists of the movement. Ywan Goll, a foremost Expressionist of the time mostly accredited for his poetry, first introduced projections into his productions of The Chaplinade and The Immortal in 1920,21 four years before Piscator would implement the same effect in his production of Flags. Piscator used projections of stills and of motion pictures in his staging of Alfons Paquets Flags (Fahnen) at the Volksbhne in Berlin in 1924.22 This is considered by most scholars to be the revolution of Documentary Theatre and technical theatre. This was a shift that earned Piscator the reputation of bringing Epic Theatre and, thus, projections into mainstream theatre as we know it today. While Goll first brought projections to the stage, Piscator believed he gave them true meaning. He claimed [Flags ] was the first time to my knowledge that slide-projection had been used in this fashion.23 While projections are the most notable influence from Goll and the Dada

PAGE 15

! "&! movement, other techniques were implemented into Piscators productions.24 Goll used these techniques to allow the audience to transcend real life into the imagination, while Piscator used the techniques to confront the audience with real life.25 After the production of Flags, only one of Piscators productions, What Price Glory?, would be performed without the use of projections.26 1.4 The New School Leaving Europe in 1938, Piscator established himself in the United States by founding the Dramatic Workshop at the New School for Social Research in New York. Teaching courses on acting, directing, and playwriting, Piscator influenced an entire generation of artists including Tennessee Williams, Arthur Miller, Marlon Brando, Bea Arthur, Judith Malina (co-fou nder of the Living Theatre),.27 During his time in the United States, Piscator further developed his technique of Epic T heatre and the Dramatic Workshop is largely responsible for the mobilization of the Off -Broadway movement, which provided a venue for experimental theatre. Piscator fled to former West Germany after being subpoenaed by the House Un American Activities Committee (HUAC) in 1951 He never returned to theater in the United States.28 Yet, he would have a lasting impression on both performance and technical theatre. After Piscators abrupt departure from the Dramatic Workshop, his wife, Maria Ley Piscator became the director of the Dramatic Workshop and subsequent organizations.29 Piscators most notable contributions to the America theatre were his unique and experimental staging techniques, which can be seen in the Off and Off -Off Broadway theatres.30 The students Piscator educated while at the New School would go on to be come some of the most influential artist in American theatre taking with them a sense of lasting association with Epic Theatre techniques.

PAGE 16

! "'! 1.5 Tennessee Williams In a letter to his mother, Williams described Piscator as a terribly dictatorial German, completely impractical and to comply with his demands will destro y the poetic quality of the play.31 Although Williams loathed Piscators teaching practices and ideals on the nature of art, he still highly respected his theatrical techniques.32 Williams coined the term plastic theatre in his production notes for The Glass Menagerie referring to the use of projections, music, and lighting as expressionistic tools, not in attempt to avoid reality, but rather to approach experience more closely. 33 In contrast to Piscators use of staging techniques, Williams wanted the sentiment of the play to be manifested in a tangible fashion.34 Specifically, projections were meant to be used to emphasis certain scenes that held significance in the plot. The Glass Menagerie lends itself to this type of plastic theatre because it is a memory play giving the audience a chance to experience the emotional journey of the characters. The majority of Williams plays allude to the use of Total Theatre It is through Williams that projections in dramatic plays versus epic plays were introduce d into mainstream theatre. 1.6 The Artist Collectives The Living Theatre, founded by Judith Malina and Julian Beck, was the first wave of experimental theatre that came out of Piscators theatrical staging ideas and the Off Broadway movement in the 1960s.35 To this day, the Living Theatre is the oldest experimental theatre still in existence. Total Theatre was the instrument for this transformative theatre. Judith Malina comment ed on Piscators staging techniques remarking that Piscators innovative breakthroughs are essential!I think that modern theatre!couldnt be what it is unless Piscator had done what he did!his concept of Total Theatre was the use of everything we have and know!now he would be experimenting with laser beams and holograms in the theatre.36 Concepts from Piscators definition of Epic Theatre are seen in the Living Theatres productions and are evident in their mission stat ement as written by Julian Beck:

PAGE 17

! "(! To call into question who we are to each other in the social environment o f the theater to undo the knots that lead to misery, to spread ourselves across the public's table like platters at a banquet, to set ourselves in motion like a vortex that pulls the spectator into action, to fire the body's secret engines, to pass through the prism and come out a rainbow, to insist that what happens in the jails matters, to cry Not in my name! at the hour of execution, to move from the theater to the street and from the street to the theater. This is what The Living Theatre does today It is what it has always done.37 Other groups similar to the Living Theatre were developed in later generations. The anti -Vietnam War movement sparked another revitalization of the experimental theatres by way of Off-Off-Broadway in the early 1970s. The Wooster Group was among the collectives that emerged at the time. Distinctive for its combination of aesthetic and political radicalism with intellectual rigor,38 nearly all of the Wooster Groups productions have had a video or projection element since its creation in 1975. Similar to Wooster, the Builders Association, established in 1994 by Marianne Weems produced shows that blend stage performance, text, video, sound, and architecture to tell stories about human experience in the 21st century.39 Several other collectives in the United States and abroad utilize the techniques Piscator made relevant with his ideas of Epic and Total Theatre. However, he is often not given the credit for such ideas. Perhaps this has to do with his strict Communist demeanor and lack of respect for actors and playwrights. It is also possible that as a director and producer he did not have a widespread reach beyond his direct audience, as a playwright would.

PAGE 18

! ")! 1.7 Bertolt Brecht There is a debate among scholars as to who should be the credited as the inventor of Epic Theatre: Erwin Piscator or his contemporary, Bertolt Brecht. Brecht was a German playwright and director working in Berlin at the same time as Piscator. Brecht is undeniably the most important playw right to have emerged in Germany since the First World War.40 As far as the stage techniques generated from Epic Theatre, Brechts work illustrates that he not only appropriated Piscators ideas but he exploited [Piscators] distinctive techniques.41 Several months after Piscators production of Flags was staged with the use of projections, Brecht wrote Edward II, employing the same use of slide-projections and title-cards. In the years to come there was a multitude of instances where Brecht pilfered P iscators unique style.42 Brecht himself asserted that Piscator, who without a doubt is one of the most important theatre men of all times, began to transform [the stages] scenic potentialities. He introduced a number of far reaching innovations. One of them was his use of the film and of film projections as an integral part of the setting.43 Brechts own political theatre style would not be seen until his production of Man is Man in 1931, two years after Piscator articulated his idiosyncratic stage techn iques in his published work, The Political Theatre (Das Politische Theatre) in 1929.44 Whether scholars believe it be Brecht or, indeed, Piscator who brought projections into the theatre, one thing that can be agreed upon is that it is here. Projections are not solely in theatre either They have or are becoming commonplace in almost all areas of live performance and art.

PAGE 19

! "*! Chapter 2. Research Process 2.1 Current State of the Industry A general analysis of the type of projections that are in the entertainment industry reveal two major categories: background projections and installation projections. These genres are most prevalent in theatre and music or architecture. Methods for projection design vary because the focus of the audience changes. Background projections are most commonly seen in theatres because the focus remains on the actors and the action. A physical locale or abstract image is projected onto the screen with an artist edge that helps to further set the atmosphere or tone of the play. It is a very effective technique and has become increasingly more popular in mainstream theatre. Projections are used as backdrops because in dramatic theatre the most important element is the actor. The actor is responsible for conveying the st ory with the aid of projections, not the other way around. With projections in the background or at the most built around the actor, there is very little to distract or obstruct the view for the audience. The Broadway world of projection designers is small, at best. There are a handful of designers such as Zachary Borovay45 (Lombardi, Xanadu), Peter Flaherty46 (Sondheim on Sondheim ), Wendall Harrington47 (Having Our Say, Ragtime, Tommy) who have set the industry standard for how projections are utilized in theatres. The most common tool for projection design in theatre in Watchout, a video and image playback system that lets the designer place media and add limited effects. This is an ideal program for cueing a show, but with a price tag well over $10,000 for multi-screen projections, Watchout is mostly left on the Broadway stage. For most projection installation performances, there is a look but dont touch policy. These installations can be a large or small-scale gallery-type event, where the audienc e watches the performance unfold onto a unique medium using projection

PAGE 20

mapping tools. Specialized software has been designed to warp and mask the projected image to make it fit perfectly on irregularly shaped screens or objects.48

Traditionally, rectilinear projections are rectangular images produced by a projector. The screens for these projections are typically mounted to a wall and replicate the shape of the projector's image. In the past, the projector's image dimensions dictated the screen size and shape, limiting the output to a four-sided box. Irregularly shaped screens or objects are considered to be non-rectilinear. The premise behind non-rectilinear projection mapping is that the projector no longer dictates the screen, but rather the other way around. The projected image takes the shape of its surface, regardless of the form. Having a three-dimensional cube as the surface for projections is considered non-rectilinear because the image output is not confined to the traditional four sides. A figure with 30 to 40 sides creates a complex polygonal, non-rectilinear projection map, as seen in the example below from the Polygon Mapper actor created by the author in Isadora as the subject of this project in lieu of thesis. Curved surfaces are also considered non-rectilinear. By using a large number of vertex points, curves can be mapped to outline the form of nearly any object. The projection map pictured to the right was created with the Polygon Mapper actor in Isadora and has 110 individual vertex points that create the shape of the object.

PAGE 21

! #"! Impressive examples of non-rectilinear projection mapping are seen in various venues across the world. Companies such as V Squared Labs49 and 1024 Architecture50 create astonishing imagery with projections for use in concerts and marketing endeavors Disney World has a spectacular new show called, Magic, Memories, and You that projection maps onto Cinderellas castle in Orlando, Florida. The audience stands back and watches as 16 different proj ectors illuminate the castle with vines growing up the turrets one moment and the next it bursts into flames. Installations like this sometimes have themes or even a story line but overall, the purpose is to give the audience a spectacular visual display. The focus is on the projected images and not on the actors, which is most likely why you rarely see actors in these types of projection installations. Programs such as MadMapper were created specifically for pr ojection mapping and have been integrated heavily into projection software programs for VJs. A VJ is a the disc jockey of video and the live mixing component of their work makes the programs they use less than ideal for a live theatrical performance where the actors and technicians are relying on precision timing and replicable cues. The intent of this project in lieu of thesis is to combine the cue-ability of Broadway productions with the wow-factor of non-rectilinear projection mapping and make them affordable and accessible to any theat re on any budget. 2.2 Software Overview The existing projection mapping software is robust as a whole. However, individually the different applications fall short of creating a complete theatrical package. The goal of my thesis project was to enhance the existing projection software Isadora to

PAGE 22

create a comprehensive program that meets the needs of theatres across the spectrum of experience and budget. This software review will highlight the benefits of my design of the Polygon Mapper actor within Isadora in comparison to other applications available.

PAGE 23

2.2.1. Software Comparisons*

[Comparison chart: Isadora64 with the Polygon Mapper actor,65 Isadora alone, MadMapper,66 Modul8 with MapMapMap,67,68 Resolume,69 VPT,70 Pure Data,71 Max/MSP/Jitter,72 Quartz Composer (QC),73 QLab,74 Watchout,75 and Green Hippo76 are compared on price, upgrade cost, education discount, cue-ability, user-friendly interface, dynamic mapping, number of vertices, number of maps per scene, perspective mapping (warping), concaved maps, map manipulation, media manipulation, anti-aliasing, the ability to add effects, cross-platform support (Mac/PC), preview mode, multi-screen output, MIDI, live video input, SDK availability, recording and exporting of stages, and the need for proprietary hardware.]

*Projection mapping is a new and quickly evolving technology. Pricing also prohibits collection of information from some sources. The software comparisons are accurate as of the available documentation and resources at the time of authorship.
**An SDK is a software development kit allowing programmers to extend the program beyond its original scope.

PAGE 24

2.2.2. Isadora

Isadora is "the award-winning, graphic programming environment for Macintosh and Windows that provides interactive control over digital media, with special emphasis on the real-time manipulation of digital video. Since every performance or installation is unique, Isadora was designed not to be a 'plug and play' program, but instead to offer building blocks that can be linked together in nearly unlimited ways, allowing you to follow your artistic impulse."77

In Isadora, the blocks are referred to as actors, and each has a different functionality. This can be to display media onto the screen (Projector actor), to switch something on and off (Toggle actor), or to map an object (Polygon Mapper actor). Isadora has several unique functions, which guided my decision to expand its mapping abilities with the creation of the Polygon Mapper actor. The most important feature is cueing. In theatre, or any live show for that matter, there are typically multiple acts and scenes. Each scene often requires a slightly or completely different mood or setting, conveyed with images or videos. Thus, it is important to have a method for cycling through those different scenes effectively.

PAGE 25

2.2.3. MadMapper

MadMapper is strictly a mapping application, designed for Mac OS X, that shares video with other applications. It is not designed as a show control system; therefore it cannot be cued, send multiple outputs to different projectors, or add effects to videos. Its only function is mapping. Its graphical user interface is simple, clean, and easy to understand. Yet MadMapper cannot be used on its own to control the projections for a live performance; another piece of software must provide the main control. For example, a video can be mapped in MadMapper and then sent to another program, such as Isadora or Modul8, via a free third-party program called Syphon.78 The projection map is siphoned from MadMapper into Isadora and controlled in the latter. Therefore, in order to use MadMapper in a cued performance, the user must purchase both MadMapper and the control program. MadMapper is also not cross-platform, limiting its use to only Mac users. MadMapper can be siphoned into and out of Isadora, Resolume, and VPT.

PAGE 26

2.2.4. Modul8 with free MapMapMap software

Modul8 is a video mixing and compositing program, available only on Mac OS X, that was designed for VJs: video jockeys who mix videos in real time, typically to music, in concerts, nightclubs, and music festivals. The mixing happens in real time by adding effects, cross-fading, and manipulating videos. While this art is a type of projection design, it is not ideal for theatres, since the video performance typically changes every night. In addition, the graphical user interface is crowded and difficult to decipher for a novice user. The MapMapMap plugin for Modul8 does have a user-friendly GUI, but the mapping is limited to only 10 maps, each with only 4 vertices.

2.2.5. Resolume

Resolume is another popular VJ software that has limited mapping ability, with only 3 or 4 vertices. Resolume, similar to Modul8, cannot be cued for theatrical purposes and lends itself to real-time video mixing. Unlike Modul8, it is cross-platform and has a mostly intuitive user interface.

PAGE 27

2.2.6. Video Projection Tool (VPT)

VPT is a free projection tool for both Mac and Windows that does have mapping capabilities, limited to 16 vertices and 32 maps. Its mapping function is the most similar to my proposed Polygon Mapper actor for Isadora, since it can create concaved maps with no warping of the image. It also works with Syphon, but not MadMapper. It has very limited cueing capabilities and a complicated user interface. Minimal effects can be added to the media, but it cannot be animated or manipulated the way the Polygon Mapper can. At present, VPT is unstable on Mac OS X and has a tendency to freeze or crash.

2.2.7. Pure Data (PD)

PD has a construction similar to Isadora's. Basic functional blocks are connected together with links to create actions. It has a minimalist user interface that is difficult to navigate and understand. The vocabulary for understanding PD is extensive and presents novice users with a steep learning curve. Pure Data is cross-platform and compatible with Syphon. However, Pure Data's biggest downfall is its complexity and a lack of inherent cueing capability.

PAGE 28

2.2.8. MAX/MSP/Jitter

Max/MSP/Jitter has a construction and vocabulary similar to Pure Data's, which is difficult to follow. The interface looks sleeker, yet PD offers more user information if you know where to look for it. Max/MSP/Jitter can be used in combination with MadMapper for video mapping, but it still lacks the ability to be cued for theatrical performance and lends itself more to live video mixing, audio creation, and 3D rendering.

2.2.9. Quartz Composer

Quartz Composer is a very powerful Mac-based software created by Apple that comes standard on all new Mac computers. It has functionality similar to Isadora's and PD's, with building blocks used to create actions. It has impressive graphics capabilities and a quad mapper, which is limited to 4 vertices. The program is very difficult to understand, and there are layers of information needed to understand how to build a patch from scratch. The program itself is not able to be cued, but it can be cued through QLab. It is only available for Mac.

PAGE 29

2.2.10. QLab

QLab is a Mac-based video and audio playback software. It allows for very limited manipulation of visual media and works in conjunction with Quartz Composer. It is a very effective tool for cueing a performance but has no mapping functionality. The program is free for a limited version and $600 for the full version.

2.2.11. Watchout

Watchout is software used on Broadway and in other large live performance venues that allows the user to send audio, video, and graphics to multiple displays. Watchout does the same thing Isadora does for 50 times the cost. In order to project onto 6 different screens (the maximum Isadora allows), you must purchase 7 licenses at $2250 each: one for the control computer and 6 for the display computers. Isadora can do the same under one license for $350. Watchout is an elegant program, but the price tag leaves it far out of reach for most theatre venues. Watchout does have geometry correction capabilities for the projectors, but it is not able to dynamically map complex shapes.

PAGE 30

2.2.12. Green Hippo

Green Hippo is a highly specialized proprietary media server and software designed for timeline playback, used in live music concerts, on Broadway, and at other high-profile live events. Creation of content is done in outside editing software, such as Final Cut Pro and After Effects. Media is composited in Green Hippo but not constructed there. There are only a limited number of people in the world who are considered experts with Hippo media servers; currently, in North America there are 29 Hippo Experts.79

Chapter 3. Development Process

Researching other projection software revealed several characteristics that should be integrated into the software extension of Isadora. Three major aspects were necessary for my project to be a success. The mapping software had to be accessible to the general public. Programs such as Watchout and Green Hippo are so highly specialized that the majority of theatres and university theatre programs simply cannot afford the technology. The software also had to be cue-able. This is imperative in the theatre to account for the different scenes and tones in any given performance. Complete control and versatility are also determining factors in theatre. Most plug-and-play systems do not have these features. The goal of my Polygon Mapper within Isadora is to provide a program that houses all of these features in one piece of software.

PAGE 31

! $"! 3.1. Isadora with proposed mapping software Isadora in conjunction with the Polygon Mapper actor, offers versatile dynamic mapping ability with an unlimited number of vertices, map manipulation, media manipulation, effects, and adjustable anti -aliasing. The current state of the actor does not integrate warping of the image it is mapping onto the object. Further iterations of the Polygon Mapper will include this functionality. It does, however, allow for concaved maps unlike other warped mapping software. 3.2. Characteristics for comparison In this section, characteristics related to the mapping, manipulation, and cueing will de detailed. More details on how to map with the Polygon Mapper Actor can be found in the Appendix: Manual. 3.2.1. Dynamic Mapping For the purposes of this research project, dynamic mapping is the ability to manipulate the vertices of a map by clicking on the map itself, or a control panel with a diagram of the map. In the mapping program MadMapper the top screen is the

PAGE 32

preview/control panel and the bottom screen is the output screen. Manipulation of the map happens on the preview/control panel as well as the output screen. Modul8, with the free plugin MapMapMap, also has a dynamic mapping interface, where the mapping occurs on a control panel only.

In other applications that have the ability to map, such as Resolume, there is interactive control but not dynamic control of the map. Interactive means that manipulation is still done by the user, but only by changing the values associated with the map rather than by manipulating the actual map itself. The image below illustrates the interface for Resolume and the mapping effect that is inherent in the program. Sliders on the right control the vertices. Each vertex has an x and

PAGE 33

y slider; therefore, to map an object with 4 points, 8 different sliders must be set: top left X, top left Y, top right X, top right Y, bottom left X, bottom left Y, bottom right X, and bottom right Y. This is typically not an issue when mapping a small number of vertices, such as 3 or 4, which is the extent Resolume allows. However, if you wanted to map an object with 27 vertex points, you would need 54 sliders.

Yet there is value in having a slider for each coordinate. At times, dynamic mapping can lack precision if the object is smaller or more detailed. In these circumstances, being able to pinpoint the exact value for the vertex is helpful. Programs such as MadMapper and Modul8 do not have this functionality. In the development of the Polygon Mapper for Isadora, both aspects of mapping, dynamic manipulation and value precision, are available. This is a unique aspect of mapping that is not found in any software readily available to the public.

When a patch is created (see the manual for specific details on how to create a mapping patch), a Stage Mouse Watcher actor is used to allow the dynamic aspect of control. The Stage Mouse Watcher is one of several actors inherent in Isadora. Dynamic manipulation is the primary way of mapping with the Polygon Mapper in Isadora. Similar to other programs discussed previously, the map can be created by moving the vertices on the preview screen. Manipulation can also occur on the output screen if desired. This is helpful if certain angles of the object being mapped cannot be seen from the control computer's point of view. With a wireless mouse and keyboard,

PAGE 34

mapping can occur without looking at a computer screen, but rather by looking directly at the object being mapped. See Appendix: Mapping Photos for pictures of this feature.

3.2.2. Map Manipulation

One feature that all mapping software has in common is the ability to manipulate the map once it has been created. Manipulation in this section is defined as translation, rotation, and scaling.

Translation refers to the movement of the entire map from one location on the screen to another. Changing the horizontal and vertical positions on the Projector actor inside Isadora does this. The Projector actor is used to display the image on the screen; the horizontal and vertical positions control where on the screen it is located.

Rotation: the map can be rotated 360° in either direction. The spin input on the Projector actor controls the map's rotation.

PAGE 35

Scaling is the same as zoom. The map starts at 100%. It can be scaled down to 0% and up to 700% of its original size. Scaling, or zoom, is also controlled by the Projector actor via the zoom input.

3.2.3. Media Manipulation

Only Isadora and MadMapper have the ability to manipulate the media once it is inside the map. This is an important feature in time-sensitive situations when there is no time to edit or crop the image. When an image is loaded into a map, the entire image becomes the output map. Isadora and MadMapper have the ability to show only a portion of that image inside the map. If a director would like to test what the projections would look like if the image were cropped, that is very quick and easy to do with the media manipulation functions. Without these, the designer would have to take the image into editing software such as Photoshop, edit the image, and then upload it back into the program for the director to see. This process is eliminated with media manipulation functionality.

3.2.4. Concaved Maps

In its current state, the Polygon Mapper does not have the ability to warp an image inside the map. It does, however, have the ability to create concaved maps, where the polygon cuts into itself. VPT is the only other mapping software with the same characteristic.

3.2.5. Anti-Aliasing

Anti-aliasing is the technique used to smooth the jagged lines of graphics. A detailed explanation of anti-aliasing is presented in chapter 3.3.4. There are projection

PAGE 36

design programs, such as MadMapper and Modul8 with MapMapMap, that have anti-aliasing built into the graphics rendering. This effect smooths the perimeter of the map. MadMapper offers optimized anti-aliasing, and Modul8 has very limited anti-aliasing in its map. Other mapping programs, VPT and Resolume, have no anti-aliasing. Even Isadora has no anti-aliasing on basic shapes rendered inside the program. The Polygon Mapper is the only mapping program that has the ability to control the optimization of the anti-aliasing along the map edges.

3.2.6. Adding Effects

Common to all mapping software, except for MadMapper, is an effects library that can be applied to the media inside the map. Often, more effects can be added than originally come with the program; these are called plugins. A popular plugin effects library is FreeFrame, which is available in Resolume, Pure Data, Jitter, Modul8, and Isadora. There are a variety of effects offered by FreeFrame, including PanSpinZoom, Kaleidoscope, Fish Eye, Gaussian Blur, and Glow. My design of the Polygon Mapper takes advantage of these plugins to create a more robust mapping package.

3.3. Polygon Mapper Design

This section will provide a comprehensive assessment of how the Polygon Mapper was designed and how it functions, aspects that were touched on in the previous chapter. In addition, Isadora actors that are used in conjunction with the Polygon Mapper for editing will be detailed. The design premise of Isadora is the use of building blocks that can be configured and linked together to create actions. This project in lieu of thesis was the design and deployment of a new block, the Polygon Mapper actor, giving Isadora a new, unique functionality.

PAGE 37

3.3.1. Video Input

Isadora uses the term "video in" to refer to a media input format that can accept movie and picture files. Isadora also has a Core Image (CI) upgrade, available only on Mac, that utilizes the GPU for image processing rather than the CPU. CI actors are only compatible with other CI actors, can only be used on Mac, and are differentiated by the words "image in" instead of "video in." The type of media that can be used is the same in video and CI image inputs, and generally, for theatrical projection design, there is no noticeable difference between the two. The Polygon Mapper was developed to be cross-platform to allow for more widespread use in a range of theatres; therefore, CI inputs were not used in the Polygon Mapper. The video in on the Polygon Mapper can be linked to the video out of a Picture Player or Movie Player actor. Acceptable file formats for pictures and movies are:

Image files: .JPG, .PNG, .PDF, .PSD
Video files: .MOV, .AVI, .QTZ

3.3.2. Mouse X and Mouse Y

Isadora simplifies its value and numbering system by viewing each screen, or stage as they are referred to in Isadora, on a percentage scale from 0 to 100. No matter the resolution of the stage, the values are converted to a standardized format. For example, if the output stage is 320 x 240, the coordinates for the bottom right corner of the stage will be (100, 100) rather than (320, 240).

PAGE 38

The mouseX and mouseY inputs on the Polygon Mapper are designed to receive information from the Stage Mouse Watcher. The Stage Mouse Watcher actor watches for the mouse to enter the stage and then reports the horizontal and vertical positions (X and Y). Those values are then transferred to the Polygon Mapper inputs mouseX and mouseY. As vertices are mapped, the values from the Stage Mouse Watcher are recorded into the X and Y inputs for each vertex.
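To make the 0-100 stage convention and the vertex-recording behaviour described above concrete, the following is a minimal C++ sketch, not the Polygon Mapper's actual source: it converts a raw pixel position to Isadora-style stage percentages and stores a reported mouse position in the active vertex. The struct and function names are hypothetical, and it assumes the Stage Mouse Watcher already reports positions on the 0-100 scale.

    #include <vector>

    // Hypothetical vertex storage: coordinates are kept as stage percentages (0-100),
    // matching Isadora's resolution-independent convention described above.
    struct Vertex { float x; float y; };

    // Convert a raw pixel position on a stage of the given resolution to the 0-100 scale.
    // Example: pixel (320, 240) on a 320 x 240 stage maps to (100, 100).
    Vertex toStagePercent(int pixelX, int pixelY, int stageWidth, int stageHeight) {
        Vertex v;
        v.x = 100.0f * static_cast<float>(pixelX) / static_cast<float>(stageWidth);
        v.y = 100.0f * static_cast<float>(pixelY) / static_cast<float>(stageHeight);
        return v;
    }

    // Record the reported mouse position (assumed to already be in percentages)
    // into whichever vertex is currently being edited.
    void recordActiveVertex(std::vector<Vertex>& vertices, int activeIndex,
                            float mouseXPercent, float mouseYPercent) {
        if (activeIndex < 0 || activeIndex >= static_cast<int>(vertices.size())) return;
        vertices[activeIndex].x = mouseXPercent;
        vertices[activeIndex].y = mouseYPercent;
    }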

PAGE 39

For added precision, the slider for each value can be used to pinpoint the location of the vertex. A value can also be typed directly into the input.

3.3.3. Key Code

The keyCode input on the Polygon Mapper is the receiver for its trigger, the Keyboard Watcher actor, which allows for cycling from one vertex to another for editing. The Keyboard Watcher "looks for keys on the computer keyboard to be pressed, released, or both. The key range input property can be set to limit the range of characters that this watcher will see. When this watcher sees a character within the specified range, it will send the character that was typed out of the key output."80 The Polygon Mapper is programmed to listen for the left and right arrow keys. Each key has a specific value associated with it: the left arrow key value is 28 and the right arrow key is 29. In the key range input on the Keyboard Watcher, the range 28-29 must be entered, and the key output must be connected to the keyCode input on the Polygon Mapper.
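As a rough illustration of the cycling this wiring enables, the hypothetical helper below advances or rewinds an active-vertex index when the arrow-key codes 28 and 29 arrive, wrapping around the polygon; it is a sketch of the behaviour described in the text, not the actor's source.

    // Key codes listened for by the Polygon Mapper, per the text above.
    const int kLeftArrow  = 28;
    const int kRightArrow = 29;

    // Move the active edit dot to the previous or next vertex, wrapping around
    // the polygon so the selection travels in a complete circle.
    // Hypothetical helper; the actor's internals may differ.
    int cycleActiveVertex(int activeIndex, int vertexCount, int keyCode) {
        if (vertexCount <= 0) return activeIndex;
        if (keyCode == kRightArrow)                       // clockwise
            return (activeIndex + 1) % vertexCount;
        if (keyCode == kLeftArrow)                        // counter-clockwise
            return (activeIndex - 1 + vertexCount) % vertexCount;
        return activeIndex;                               // ignore other keys
    }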

PAGE 40

The vertex that is being edited is shown with a green edit dot. Idle vertices appear as red edit dots. The left and right arrow keys move back and forth from one edit dot to another in a complete circle. The right arrow key moves the active vertex clockwise, while the left arrow moves it counter-clockwise.

3.3.4. Anti-Aliasing

The real world, not seen through a screen, is made up of an innumerable number of shapes: straight, curved, continuous, broken, etc. The human eye can perceive all of these shapes. However, our method for viewing these infinite formations in a digital form is limited.

3.3.4.1 Anti-Aliasing Defined

Computer screens are made of discrete square pixels in a grid pattern, which are filled with light. When a vertical or horizontal line is drawn on a computer, the pixels are filled completely (left). The eye is not restricted to a grid the way digital sources are; therefore, in order to get an accurate diagonal line, grid lines must be crossed (right).

PAGE 41

! %"! This is not possible on a computer monitor because each grid square is one indivisible pixel that cannot display more than one color. To the left is a diagonal line on a computer screen without anti -aliasing. Anti -aliasing is an effect used to make jagged diagonal lines appear smoother on a computer screen or other pixel-based devices.81 In order for the diagonal line to appear smoother, the alpha channel (opacity) must be adjusted in the adjacent pixels. The line appears smoother because the edge is being blurred to some extent. (right) Below is an example using text. The red a has anti-aliasing applied to it, while the black a has none. When made smaller, the red a appears much smoother, while the black a still looks jagged.

PAGE 42

3.3.4.2. Supersampling82

Supersampling is a technique used in anti-aliasing. Samples are taken from each individual pixel to test two parameters of the pixel.

Parameter 1: pixel location within the polygon. The first test determines if the given pixel is inside or outside of the polygon being displayed.

Parameter 2: is the pixel an edge pixel. This test determines if the pixel is on the edge of the polygon and calculates the color value that should be applied to that pixel in order to smooth out the jagged line caused by aliasing.

The image below represents one pixel. A portion of the image being rendered crosses through this pixel. When no anti-aliasing is applied, the center of the pixel is tested to see if it should be red or white. In the case below, this pixel would be rendered as white, since the tested sample is in the white section of the pixel. This is how the jagged lines in graphics are created.

PAGE 43

Supersampling collects the color information from multiple samples inside each pixel to determine the output color for that pixel. In the example below, the pixel has been split into 4 evenly sized sub-pixels and samples are taken from the center of each. One of the four samples is in the red area; therefore, this pixel will be filled with the color that represents 1/4 of red (an example of grid sampling).

In the RGB (red, green, blue) color model, combinations of three values create the desired color. Each red, green, and blue channel has 256 values ranging from 0 to 255. For instance, the values (255, 255, 255) produce white, while (0, 0, 0) produce black. Similarly, (255, 0, 0) equals true red, (0, 255, 0) true green, and (0, 0, 255) true blue. All other colors are variations in between those values. One of the four samples taken from the pixel above lands on a red portion of the polygon, and so the RGB value will represent 1/4 of the red RGB channel. To achieve this, the program runs the following calculation:

(Full color value / Total number of samples) x Number of samples inside the color = Value of color for the pixel
(255 / 4) x 1 ≈ 64
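A minimal sketch of that kind of calculation follows. It tests the centres of an n-by-n grid of sub-pixels against a hypothetical insideShape predicate and scales the full 255 channel value by the fraction of samples that fall inside the shape, so one of four samples inside yields roughly 64, as above. This illustrates grid supersampling in general rather than the Polygon Mapper's own code.

    #include <functional>

    // Grid supersampling for a single pixel: sample an n x n grid of sub-pixel
    // centres and return a 0-255 channel value proportional to coverage.
    // 'insideShape' is a hypothetical predicate answering whether a point in
    // pixel-local coordinates (0..1 on each axis) falls inside the polygon.
    int supersamplePixel(int n, const std::function<bool(float, float)>& insideShape) {
        int hits = 0;
        for (int row = 0; row < n; ++row) {
            for (int col = 0; col < n; ++col) {
                float x = (col + 0.5f) / n;   // centre of this sub-pixel cell
                float y = (row + 0.5f) / n;
                if (insideShape(x, y)) ++hits;
            }
        }
        // Full channel value (255) scaled by the covered fraction, rounded.
        // With a 2 x 2 grid and one hit this returns 64, matching the example.
        return (255 * hits + (n * n) / 2) / (n * n);
    }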

PAGE 44

The gradient below represents the colors and values that could be applied to the pixel depending on the 4 samples that were tested.83

3.3.4.3. Jitter Supersampling

Optimized anti-aliasing can be added in the form of jitter points. In the previous example of grid anti-aliasing, all four points are evenly spaced in the four quadrants of the pixel. The same principle would apply if the number of samples increased to 8, 16, and so on: all the points would be evenly spaced. Jitter supersampling adds a randomization of points to get a better overall sample from the pixel. As seen here, it is possible with jitter supersampling to have a different number of points within the red, creating a different shade of red to be filled in this pixel.

PAGE 45

3.3.4.4. Jitter Points Source Code

Programmers have worked with anti-aliasing for many years and have created optimized randomizations for sampling.84 These values were used in the source code for the Polygon Mapper, and there are 8 different levels of anti-aliasing incorporated into the actor (including level 0, no anti-aliasing). The remaining levels take a different number of samples in each pixel: 2, 3, 4, 8, 15, 24, and 66. The randomization of each sample is given an x and y value based on the coordinates inside the pixel. The values range from -0.5 to 0.5 for both x and y, with (0, 0) in the middle of the pixel, since each pixel has a size of 1 by 1.

    sum = pointInPolygon(info->mCoordinates, polySize, col-0.208147, row+0.353730)
        + pointInPolygon(info->mCoordinates, polySize, col+0.203849, row-0.353780)
        + pointInPolygon(info->mCoordinates, polySize, col-0.292626, row-0.149945)
        + pointInPolygon(info->mCoordinates, polySize, col+0.296924, row+0.149994);
    num_of_tests = 4;

The values in blue represent the coordinates of the jitter sample points based on 4 samples.85 Those points are plotted on the pixel below.
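The snippet relies on a pointInPolygon helper from the actor's source, which is not reproduced here. For reference, a generic even-odd (ray-casting) point-in-polygon test of the kind such a helper typically performs is sketched below; the simplified signature is an assumption and does not match the actual Isadora SDK call.

    // Generic even-odd (ray-casting) point-in-polygon test, for illustration.
    // 'xs' and 'ys' hold the polygon's vertex coordinates; returns 1 if the
    // point (px, py) lies inside the polygon and 0 otherwise.
    int pointInPolygonSketch(const float* xs, const float* ys, int vertexCount,
                             float px, float py) {
        int inside = 0;
        for (int i = 0, j = vertexCount - 1; i < vertexCount; j = i++) {
            // Does edge (j -> i) straddle the horizontal line through py,
            // and does it cross that line to the right of px?
            bool straddles = (ys[i] > py) != (ys[j] > py);
            if (straddles) {
                float crossX = xs[j] + (py - ys[j]) * (xs[i] - xs[j]) / (ys[i] - ys[j]);
                if (px < crossX) inside = !inside;
            }
        }
        return inside;
    }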

PAGE 46

3.3.4.5. Anti-Aliasing Within the Polygon Mapper

Patches within Isadora vary in density based on the amount and size of media within them. A patch that contains still images will be slightly faster than one that contains video files. The same is true for patches with a combination of images and video plus various effects. Having a high level of anti-aliasing can slow the computer's processor and cause lag. In live performances, it is imperative that cues trigger at the precise moment they are meant to. Too much lag in a show with data-heavy media could lead to severe delays or a complete crash of Isadora or the computer itself. To help compensate for the variety of performance types and machines that might be used, the Polygon Mapper was designed with an important feature: the user has the ability to choose the level of anti-aliasing applied to the map depending on how dense the patch is. The levels range on a scale from 0 to 7, with zero representing no anti-aliasing being applied to the map. From there, the number of sample tests performed in the code increases to 2, 3, 4, 8, 15, 24, and 66 jitter points.

Anti-aliasing is only performed when the map is initially drawn and when there are subsequent changes to it. When mapping a still image this effect cannot be observed. However, if anti-aliasing were constantly being applied, mapping a video file with changing frames would dramatically slow down the playback, because the map would have to be redrawn every frame. Since anti-aliasing is applied when the map is being drawn or redrawn, it is recommended that the anti-aliasing be set to zero while mapping in Isadora to avoid any issues. Changing the anti-aliasing value temporarily slows media down to allow for recalculation of the alpha mask. Setting the anti-aliasing to level 7 (66 jitter points) during a data-dense performance is discouraged, since the number of calculations being performed on the pixels is substantially higher than at the lower anti-aliasing levels (level 6 = 24 jitter points).
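As a small illustration of the level scheme just described, the lookup below maps the user-facing anti-aliasing level (0-7) to the number of jitter samples tested per pixel, using the counts listed in the text with level 0 meaning no anti-aliasing; the function name is hypothetical.

    // Jitter samples tested per pixel for each anti-aliasing level (0 = off),
    // following the counts given in the text: 2, 3, 4, 8, 15, 24 and 66.
    const int kSamplesPerLevel[8] = { 0, 2, 3, 4, 8, 15, 24, 66 };

    // Clamp a user-supplied level into the 0-7 range and return the sample count.
    int jitterSamplesForLevel(int level) {
        if (level < 0) level = 0;
        if (level > 7) level = 7;
        return kSamplesPerLevel[level];
    }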

PAGE 47

The difference in the levels of anti-aliasing is apparent along the outer edges of the maps shown below.

PAGE 48


PAGE 49

3.3.5. fValue

The fValue input allows the verticesCount to increase or decrease by one; this is necessary when adding and subtracting vertices from the total number needed. In its current state, the Polygon Mapper's dynamic abilities are set within the code and cannot be changed by the user.

3.3.6. Edit Mode

The edit input turns the edit mode on and off. Since the values are received from a Stage Mouse Watcher, it was important to have a way to turn the edit mode off while the mouse is still over the stage. This is done by right-clicking at any point while the mouse is inside the stage; any time this happens, a trigger is generated. In order to utilize that characteristic, the right mouse down output of the Stage Mouse Watcher must be connected to the trigger input on the Toggle actor. The Toggle is designed to switch between two states, in this case on and off. The trigger out on the Toggle is connected to the edit input on the Polygon Mapper.
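The role of the edit input can be summarized with a small sketch. This is an assumption about the actor's internal logic rather than its actual source: while edit mode is off, incoming mouse coordinates are simply ignored instead of being passed on to the active vertex.

    typedef struct {
        int    editMode;      /* toggled by the right mouse button          */
        int    activeVertex;  /* index of the vertex currently being edited */
        float *vertexX;
        float *vertexY;
    } MapperState;

    static void onMouseMove(MapperState *s, float mouseX, float mouseY) {
        if (!s->editMode)
            return;           /* edit mode off: coordinates are not forwarded */
        s->vertexX[s->activeVertex] = mouseX;
        s->vertexY[s->activeVertex] = mouseY;
    }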

PAGE 50

When the edit mode is on, the green and red dots are seen at each vertex. They disappear when the edit mode is off. Also, when the mode is off, the Stage Mouse Watcher still transfers values to the mouseX and mouseY inputs, but the Polygon Mapper is programmed not to pass those values on to the vertex inputs. If the edit mode is left on as the mouse exits the stage, the vertex point being edited follows the mouse off the stage, thus changing the map.

3.3.7. Dot Size

The user has the option to choose the size of the edit dots for the vertices, ranging from 0 to 5. Dot size 5 represents a 5 x 5 pixel box, the same being true for the other sizes. This functionality allows for more precise mapping of edges: the dot size can be reduced or eliminated to better see the corners of the map. It is difficult to map when using the zero dot size, however, since you are not able to see which vertex is active, represented by the green dot.

PAGE 51

Having control over the size of the dots also comes into play when working with a large number of vertices. In the picture on the left, there are 200 vertices with a dot size of 5. With such a large number of vertices, the dots appear as a solid line around the perimeter. Lowering the dot size differentiates the vertex points more clearly.

3.3.8. Vertex Count

The most distinctive aspect of the Polygon Mapper compared with all other mapping software is its nearly unlimited number of vertices: a single map can contain 999 individual vertex points. The Polygon Mapper actor interface was designed to be dynamic. When a value is entered into the verticesCount input, the actor is populated with values for both the x and y coordinate of each vertex, labeled accordingly. The vertices are distributed equally along the outer perimeter.86 The values for the distribution of vertices are recorded in the x and y inputs of the Polygon Mapper. The vertex X and vertex Y inputs are seen in the pictures below.

PAGE 52

Note: the first vertex is vertex 0, not vertex 1; vertex 0 X and vertex 0 Y will always have the coordinates 0,0 unless they are mapped to a different location.
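The exact distribution routine is credited to Angelos Barmpoutis and is not reproduced in this paper, but one straightforward way to spread N vertices evenly along the stage perimeter (here in the 0-100 percentage space the actor uses) looks like the sketch below. It is offered only as an illustration of the idea, not as the actor's code.

    #include <stdio.h>

    static void distributePerimeter(float *xs, float *ys, int n) {
        const float perimeter = 400.0f;                /* 4 sides of a 100 x 100 stage   */
        for (int i = 0; i < n; ++i) {
            float d = perimeter * (float)i / (float)n; /* distance walked along the edge */
            if      (d < 100.0f) { xs[i] = d;          ys[i] = 0.0f;       }  /* top    */
            else if (d < 200.0f) { xs[i] = 100.0f;     ys[i] = d - 100.0f; }  /* right  */
            else if (d < 300.0f) { xs[i] = 300.0f - d; ys[i] = 100.0f;     }  /* bottom */
            else                 { xs[i] = 0.0f;       ys[i] = 400.0f - d; }  /* left   */
        }
    }

    int main(void) {
        float xs[6], ys[6];
        distributePerimeter(xs, ys, 6);                /* vertex 0 lands at 0,0 */
        for (int i = 0; i < 6; ++i)
            printf("vertex %d: (%.1f, %.1f)\n", i, xs[i], ys[i]);
        return 0;
    }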

PAGE 53

The disadvantage of having a large number of vertices with a dynamic actor is that the actor itself becomes extremely long. Isadora accommodates this, but precision mapping with the sliders can be overwhelming. Fortunately, the main and recommended mapping technique is to map dynamically within the stage preview or on the output stage.

3.3.8.1. Adding and Subtracting Vertices

Once a verticesCount has been entered into the Polygon Mapper, the values automatically populate for equal distribution around the perimeter, as discussed in the previous section. This only occurs when changing the count from zero to another number (0 to X). A standardized, user-friendly scheme for adding and subtracting vertices has yet to be implemented. The proposed method is as follows, with the stage broken into four quadrants:

x+1: One new vertex is added to the far corner of the quadrant where the active edit dot is located.
x-1: The active edit dot is removed.
x+several: Several vertices are added to the far corner of the quadrant where the active edit dot is located.
x-several: The active edit dot plus the adjacent dots in the same quadrant are removed.

PAGE 54

[Figures: X + 1, X - 1, X + several, X - several]

PAGE 55

3.3.9. Video Output

The video out output on the Polygon Mapper transfers the original media, with the new map applied, to the Projector, the actor that displays the image. The Projector actor controls which screen the image appears on through its stage input. Isadora can have 6 different stages, meaning there can be 6 projectors controlled by the program. The Projector also provides the map manipulation discussed in the Software Comparison: translation, rotation, and scale are controlled by the horz pos and vert pos, the spin input, and the zoom input, respectively. The intensity or opacity of the map, as well as the height and width of the image, can be adjusted here as well.

Two very important inputs on the Projector actor are the blend and layer inputs. The blend function has 3 settings: additive, transparent, and opaque. Additive simply blends the image with any other images that it is overlapping, while opaque completely covers other images over 100% of the screen. The transparent mode is used when you have alpha masks in your image. The Polygon Mapper creates an alpha mask; therefore, if you have two overlapping masks, the top projector must be set to transparent. The layer input organizes multiple maps into stackable layers; up to 20 layers can be used.

PAGE 56

3.3.10. Animated Manipulation and Effects87

A unique capability in Isadora that can be applied to the map is the ability to animate the translation, rotation, and scaling parameters. No other software on the market offers so much control over all aspects of a map; this is one of the reasons the Polygon Mapper was developed inside Isadora. For example, adding a Wave Generator actor to the spin parameter of the Projector actor will cause the entire map to spin 360° continuously. The same can be done for the translation and scaling of the map. The Wave Generator generates a sine, triangle, sawtooth, square, or random wave cycle at a regular rate of speed.88 Even the vertex coordinates can be automated by adding an Envelope Generator to the vertex value you want to manipulate. An Envelope Generator is an actor in Isadora that allows you to smoothly ramp from a starting value to an ending value over a specified period of time.89
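The kind of ramp an Envelope Generator performs can be pictured with a short, purely illustrative function; this is not Isadora's implementation, and the name and signature are invented for the example.

    /* Linear ramp from a start value to an end value over a given duration. */
    static float envelope(float start, float end, float elapsed, float duration) {
        if (elapsed <= 0.0f)     return start;
        if (elapsed >= duration) return end;
        float t = elapsed / duration;        /* 0..1 progress        */
        return start + (end - start) * t;    /* linear interpolation */
    }

    /* Example: vertexX = envelope(25.0f, 75.0f, timeSinceCue, 2.0f);
     * ramps a vertex from x = 25 to x = 75 over two seconds. */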

PAGE 57

In fact, any input with a value can be animated in Isadora, which opens up an unlimited number of possibilities for projection design. This relates to the idea of a building-block system that allows the user to truly create art by having complete control over the medium.

3.3.11. Media Manipulation

In addition to animating the map itself, the media inside the map can be translated, rotated, and scaled. Among mapping software, this is a feature found only in Isadora. Using the FreeFrame plugin PanSpinZoom, the actor is placed between the media player and the Polygon Mapper actor. If it were placed between the Polygon Mapper and the Projector, the PanSpinZoom actor would affect the entire map, not the media within it.

[Figures: Original; Translate / Pan; Rotate / Spin; Scale / Zoom]

PAGE 58

3.3.12. Mapping Concave Polygons

The construction of the polygon map was based on the Point In Polygon algorithm.90 This algorithm tests points to determine whether they reside inside or outside of the polygon. If a point is outside, the image being mapped is hidden there; if inside, the image is revealed. Taking into account the x and y coordinates of each vertex, the number of sides created by the vertices, and the width and height of the stage, the algorithm is performed for all the coordinates in the stage as the map is being constructed. The source code is as follows:

    int pointInPolygon(float* coordinates, int polySides, float xx, float yy, int width, int height)
    {
        int i, j = polySides - 1;   //delete -1 to occlude polygon//
        bool oddNodes = false;
        float x = xx * 100.0 / width;
        float y = yy * 100.0 / height;

        for (i = 0; i < polySides; i++) {
            if ( (coordinates[(i*2) + 1] < y && coordinates[(j*2) + 1] >= y) ||
                 (coordinates[(j*2) + 1] < y && coordinates[(i*2) + 1] >= y) ) {
                if ( coordinates[(i*2) + 0] +
                     (y - coordinates[(i*2) + 1]) /
                     (coordinates[(j*2) + 1] - coordinates[(i*2) + 1]) *
                     (coordinates[(j*2) + 0] - coordinates[(i*2) + 0]) < x ) {
                    oddNodes = !oddNodes;
                }
            }
            j = i;
        }
        if (oddNodes == false) return 0; else return 1;
    }

The image to the left represents two vertices that have been mapped with the Polygon Mapper, as indicated by the red edit dots. They are connected by a line, which does not appear on the stage while mapping. For the Polygon Mapper to work successfully, it needs to know on which side of the imaginary line the image should be allowed to show.

PAGE 59

While mapping, if only two vertices are used, no image is seen, since a minimum of 3 vertices is required to create a polygon. The images below show a closed polygon. The program now must know where to put the image: inside the polygon or outside the polygon. In the Polygon Mapper, this line of code determines that the image should be filled inside the polygon; if the -1 were deleted, it would invert the polygon map:

    int i, j = polySides - 1;   //delete -1 to invert polygon map//

The determination of which side of the imaginary line is inside or outside is made by a simple test. Points throughout the screen are tested (purple dots) and compared with the points on the same row that fall on the lines created by the vertices (blue dots). If the number of blue dots on either side of the test point is odd, the point is inside the polygon. If the number of blue dots on either side of the test point is even, the point is outside the polygon. This test allows the Polygon Mapper to create complex concave maps.
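A small, self-contained worked example makes the even/odd rule concrete. The helper below mirrors the logic of the excerpted pointInPolygon() but is simplified to take coordinates already in the 0-100 stage space (no width/height scaling); the L-shaped polygon and the two test points are invented for illustration.

    #include <stdio.h>
    #include <stdbool.h>

    static int insidePolygon(const float *c, int sides, float x, float y) {
        bool odd = false;
        for (int i = 0, j = sides - 1; i < sides; j = i++) {
            float xi = c[i*2], yi = c[i*2+1], xj = c[j*2], yj = c[j*2+1];
            if ((yi < y && yj >= y) || (yj < y && yi >= y))
                if (xi + (y - yi) / (yj - yi) * (xj - xi) < x)
                    odd = !odd;   /* one more edge crossing to the left of the point */
        }
        return odd ? 1 : 0;
    }

    int main(void) {
        /* L-shaped (concave) polygon: the notch is the upper-right quarter. */
        const float lShape[] = { 0,0, 100,0, 100,50, 50,50, 50,100, 0,100 };
        printf("(25, 75): %d\n", insidePolygon(lShape, 6, 25.0f, 75.0f)); /* 1 = inside  */
        printf("(75, 75): %d\n", insidePolygon(lShape, 6, 75.0f, 75.0f)); /* 0 = outside (in the notch) */
        return 0;
    }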

PAGE 60

In line A there are 2 blue dots on either side of the purple dot; an even number indicates that the point is outside of the polygon. In line B there are 3 blue dots on either side of the purple dot; an odd number indicates that the point is inside the polygon. Therefore, when the map is created it will look like the image below. Another example of a concave map is seen below: the picture on the left represents the non-rectilinear vertex map, and the picture on the right is the same map projected onto the object.

PAGE 61

3.3.13. Adding Effects

A variety of effects can be added to the media itself. For a detailed explanation and other examples of how this is done in Isadora with the Polygon Mapper, see the Appendix: Manual. To add an effect in Isadora, insert the effect actor between the media player and the Polygon Mapper actor via the rgb in or video in inputs.

3.4. Cueing with Isadora

The most theatre-friendly aspect of Isadora is how clear and easy it is to cue a live performance. Using the Keyboard Watcher actor, used previously, along with the Jump actor, any production can be triggered with an unlimited number of cues. Below is a very common method for cueing in Isadora. Each scene receives three jump commands: go to the next scene, go to the previous scene, and go to the first scene. The Jump actor has two modes, relative and absolute. Relative allows you to go plus or minus x number of scenes relative to your current scene location; this mode is the easiest way to set up linear cueing through scenes. Absolute mode allows you to select the exact number of the scene you wish to jump to; this is the ideal setting for moving to the beginning of the production.
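As a purely conceptual sketch of the difference between the two modes (this is not Isadora's implementation, and the names are invented for the example), a relative jump offsets the current scene index while an absolute jump names the target directly:

    typedef enum { JUMP_RELATIVE, JUMP_ABSOLUTE } JumpMode;

    static int resolveJump(JumpMode mode, int currentScene, int value) {
        /* Relative: offset from the current scene (+1 = next, -1 = previous).
         * Absolute: the exact scene number (1 = top of the show).            */
        return (mode == JUMP_RELATIVE) ? currentScene + value : value;
    }

    /* resolveJump(JUMP_RELATIVE, 12, +1) -> 13 (next scene)
     * resolveJump(JUMP_ABSOLUTE, 12,  1) ->  1 (first scene) */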

PAGE 62

In either mode the user can choose exactly which scene they wish to cue, so cueing can be done linearly or non-linearly. This is helpful if images are repeated in a show with heavy media: instead of adding another scene that is exactly the same, Isadora can be programmed to jump to and from any location in the scene timeline. The Jump actor also controls the fade between scenes.

The Keyboard Watcher controls how the cues are triggered. In the example below, the space bar triggers the next scene. This is done by entering the space bar character in the key range input. When using non-numerical values in the Keyboard Watcher, single quotes must be placed on either side of the letter; numerical values do not need single quotes. Note: as discussed previously, each key has a value associated with it. Zero through 9 are the actual values 0-9, and two-digit values correlate with non-numerical keys.

Isadora can also trigger from a control panel that the user creates to fit the individual needs of the performance. During the run of the show, the only program interface that is seen is what the user designs.

PAGE 63

Cueing from the Control Panel is done by connecting Buttons in the Control Panel with Jumps in the Scene Editor via Control IDs; there is no need for Keyboard Watchers. Other devices can control the cueing of Isadora, including MIDI controllers, joysticks, touchscreen applications, etc.

3.4.1. MIDI Control

MIDI is a protocol that allows computers and other electronic equipment to communicate and synchronize with each other.91 A detailed explanation of MIDI control is beyond the scope of this paper. However, it should be noted that Isadora has the ability to trigger other applications such as Pure Data and Ableton92 (a sound playback software). With a single press of a button, projections, lighting, and sound can be cued simultaneously. Isadora's cueing ability makes it the ultimate show control for an affordable price. Automatic, synchronized cueing eliminates the need for multiple board operators, a necessity when working on a small budget with limited resources.

3.5 Troubleshooting

Throughout the design process several issues arose, from the design of the Polygon Mapper to its practical deployment on the show-control computer. This section addresses several of the major difficulties and how they were resolved.

PAGE 64

3.5.1. Tracking the Edit Dots

An issue first encountered when using the Stage Mouse Watcher was that the vertex being edited would automatically jump to the location of the mouse. If the user is editing vertex 1, the mouse is located at that vertex. When the user then presses the arrow keys to move to vertex 2, vertex 2 relocates to vertex 1's position, because that is where the mouse is. When initially editing a map this was not an issue; however, it creates a very time-consuming problem when fine-tuning a map, since the vertex jumps to the arbitrary position of the mouse and a vertex cannot be adjusted slightly and quickly. This issue was resolved by creating a "sticking" effect between the mouse and the edit vertex. Once a vertex is made active, the "dot" will stay in place until the mouse moves within a 25-pixel zone of the vertex. At that point, the "dot" will attach itself to the mouse and follow it to the desired location. This function prevents the vertices from jumping to random and unwanted coordinates. Further testing on a large scale revealed that a 25-pixel zone was too large; it was scaled down to 10 pixels for more exact mapping.
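A minimal sketch of this "sticking" behaviour, written as an assumption about the logic rather than the actor's actual source, is simply a distance check before the vertex attaches to the mouse:

    #include <math.h>

    #define CAPTURE_ZONE_PIXELS 10.0f   /* originally 25, later reduced to 10 */

    static int mouseHasGrabbedVertex(float mouseX, float mouseY,
                                     float vertexX, float vertexY) {
        float dx = mouseX - vertexX;
        float dy = mouseY - vertexY;
        return sqrtf(dx * dx + dy * dy) <= CAPTURE_ZONE_PIXELS;
    }

    /* Only once this returns true does the active vertex start following the
     * mouse; until then it stays where it was mapped. */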

PAGE 65

3.5.2. Resolution Issues

Standardizing file formats and resolution sizes is a common practice in digital media: all files used in a single performance should be the same resolution (320 x 240, 640 x 480, etc.). However, this is not always possible, nor should a lack of standardization cause the software to function improperly. In previous versions of the Polygon Mapper, the actor expected all media to be the same resolution. If a 320 x 240 image was loaded into Isadora and then mapped, changing the image inside the map to another 320 x 240 image had no adverse effect. However, changing from a 320 x 240 image to a different resolution, for example 640 x 480, would randomly do one of two things: either the map would be relocated and resized based on the original resolution, or an image similar to the one shown would appear at the new resolution. The former represents the map at 320 x 240 appearing with the same dimensions, only on a larger 640 x 480 image. The latter is a more confusing image to decipher. When an image is loaded into the actor, an image buffer is allocated for that image. An image buffer is a piece of memory in the computer that holds the image at its resolution. If the initial image is 320 x 240, the computer allocates an array of data at those dimensions. When the image is changed to 640 x 480, the image buffer remains at 320 x 240.

PAGE 66

As the computer tries to display the 640 x 480 dimensions, it reads the first two lines of 320 pixels as the first line of 640 pixels. The array in the image buffer can be thought of as boxes, each representing one pixel. For the illustrated example below, the image size is 16 x 12. Note: in the computer's memory the boxes are actually in one continuous line. The blue and red striped image is loaded into the Polygon Mapper, and a 16 x 12 set of boxes is allocated, one box per pixel in the image. In the computer's memory the first line of 16 boxes is red, the second line is blue, and so on. When the same image at a resolution of 32 x 24 is loaded into the Polygon Mapper after an image buffer has already been allocated, the new image will use the boxes that are already allocated to try to fill itself. The first red line of 16 boxes will remain, but in an image with 32 pixels in the first line, the second set of 16 pixels, which is blue, will be used to finish the remaining line of 32. Once 6 lines of 32 pixels have been used for the new image, the original image buffer has no more memory assigned to that image.
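The same mismatch can be shown in a few lines of C using row-major indexing (index = y * width + x): reading a buffer allocated for a 16 x 12 image as if it were 32 pixels wide consumes two of the original rows per new row and exhausts the buffer after 6 rows. The example below is illustrative only and is not the actor's code.

    #include <stdio.h>

    int main(void) {
        enum { OLD_W = 16, OLD_H = 12, NEW_W = 32 };
        int buffer[OLD_W * OLD_H];                 /* allocated for the 16 x 12 image */

        for (int y = 0; y < OLD_H; ++y)            /* fill: even rows "red" (1),      */
            for (int x = 0; x < OLD_W; ++x)        /* odd rows "blue" (0)             */
                buffer[y * OLD_W + x] = (y % 2 == 0);

        /* Read the same memory back with the wrong (wider) stride. */
        for (int y = 0; y < 3; ++y) {
            printf("new row %d: ", y);
            for (int x = 0; x < NEW_W; ++x)
                printf("%d", buffer[y * NEW_W + x]);  /* row 0 prints red then blue */
            printf("\n");
        }
        return 0;
    }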

PAGE 67

Therefore, the computer fills in the remaining lines with whatever happens to be in the computer's random access memory (RAM). To avoid this issue, all images loaded into the Polygon Mapper are made resolution-agnostic by turning their resolution into a percentage on a scale of 0-100.

3.5.3. Polygon Mapper Unlinking Bug

The only issue, or bug, in the code for the Polygon Mapper that affects functionality is the unlinking of the Polygon Mapper from the Projector once mapping has been done. After the values of the map have been set in the Polygon Mapper, if you delete the link between the Polygon Mapper and the Projector, Isadora will crash. This crash does not seem to happen when you delete any other links from the Polygon Mapper. I suspect this happens because of a discrepancy in the way the map is drawn and displayed: when the display is interrupted by unlinking the Projector, the Polygon Mapper can no longer output the image and crashes. Further testing must be done to debug this issue.

Since Isadora is so versatile, there is a way around this. A User Actor is a subpatch that can be added to any patch (a patch being two or more actors connected together by a link). This subpatch holds additional actors that provide certain functionality within the patch as a whole. The Polygon Mapper, along with the actors used to control it, can be placed inside a User Actor. This prevents the Polygon Mapper from ever being unlinked directly from a Projector.

PAGE 68

Double-clicking on the User Actor opens it, and the entire Polygon Mapper patch should be placed inside. A User Input and a User Output connect to the video in and video out on the Polygon Mapper. If you unlink the Polygon Mapper User Actor from the Projector, Isadora will not crash.

3.5.4. Isadora Issues with the Show-Control Computer

The show-control computer for this project in lieu of thesis is a Mac Pro running OS X 10.6.8 with the most recent version of Isadora installed, version 1.3 of 21. Isadora and the Polygon Mapper had been functioning properly on the Mac for weeks. Unexpectedly, Isadora began to crash. Thinking there was a bug in the Polygon Mapper actor, it was removed from the computer; this did not rectify the problem. All possible reasons for the crash were meticulously ruled out: the Polygon Mapper, corrupt media, necessary software updates, video format issues. To no avail, Isadora continued to crash. Eventually, the problem was narrowed down to the Pop-up Toolbox (right). Any time the user double-clicked inside the Scene Editor to get the Pop-up Toolbox, Isadora would freeze and inevitably crash.

PAGE 69

However, if actors were retrieved from the actual Toolbox on the left side of the interface, Isadora responded as expected. With this information, a crash report was submitted to Mark Coniglio, the creator of Isadora, outlining the issues. Within the hour he responded with more questions about the nature of the crash and asked me to run a program that would compile the previous 3 crash reports and send them to him. The crash reports indicated that the issue was indeed with the Pop-up Toolbox. Mark, however, was not able to duplicate the problem, nor had he heard of any such issue in the past. Over the next two days, Mark and I worked together to fix it. Ultimately, Mark discovered that the issue was the drawing of the alpha-blended edges on the toolbox, a cosmetic characteristic of the program. He made a few changes in the code, and the new version worked on the show-control computer. That version was later released to the public as version 1.3 of 22.

3.6 Conclusion of Development Process

The Polygon Mapper actor, like all the actors in Isadora, is a building block. The examples given in this paper are merely a small piece of what can be created. In the hands of a seasoned Isadora programmer, the functionality can be altered or changed altogether. The possibilities are truly limitless because of the expandability of Isadora.

Chapter 4. The Last Unicorn: A Performance Within a Project

Equally as important as the technology created for this project was the practical deployment of the Polygon Mapper in a live performance. In conjunction with the Digital Worlds Institute and the School of Theatre and Dance (SoTD) at the University of Florida (UF), the Polygon Mapper with Isadora was integrated into the production of The Last Unicorn.

PAGE 70

The Last Unicorn was originally a novel written by Peter S. Beagle93 and published in 1968. It is a timeless fantasy story that follows the journey of a unicorn who believes that she is the last of her kind and sets off to find others like her. The novel gained huge success, and in 198294 an animated film was released with Mia Farrow as the Unicorn and other vocal talents such as Jeff Bridges and Angela Lansbury. Since 1968 the novel has never been out of print, and in January 2011 a graphic novel was released. In 2002, The Last Unicorn was adapted for the stage by playwright Le-Anne Garland, and since then only 2 staged readings have been performed. It was an honor to be granted permission to produce a studio theatre production as the vehicle by which to showcase this project in lieu of thesis. Together with Tiza Garland, an Associate Professor at UF's School of Theatre and Dance who agreed to direct the play, a design team and ensemble were brought together to create a production that could transform a flat, empty performance venue into a 3-dimensional world.

4.1 Installations

Designing the installations for the performance of The Last Unicorn was a complex process with many things to consider. The production space was at the Digital Worlds Institute in the Research, Education, and Visualization Environment (REVE). The REVE is unique because it has five 17 x 17 foot screens that surround the stage to create a semi-immersive environment. The plan was to incorporate the use of the REVE in-house screens and projectors as well as two other projectors for the installations. The final installations, built in Tetris-like configurations, served as both set pieces and projection surfaces.

PAGE 71

This entire performance was produced with no budget. The wood, the construction space, and the housing space for the installations were all borrowed or donated, and the entire set was built out of scrap pieces of plywood. The REVE is a functioning classroom and theatre; therefore, the set had to be constructed in a way that allowed for easy breakdown and condensing, which led to the modular design. There are 15 separate pieces that connect together to build the 3 installations. They are connected by loose pin hinges or barrel bolts that allow for fast and easy breakdown. At the beginning and end of every rehearsal the set was taken apart and moved out of the way for the following day's events. If needed, the set could be compacted into itself and housed in a 10 (w) x 3 (l) x 5 (h) foot space.

The 5 in-house projectors are mounted to the ceiling, roughly 18 feet high. The sight lines of the REVE house projectors were taken into consideration when building the set: it was determined that the installations had to be 8 feet away from the wall to be 6 feet high. If the installations were any taller or closer to the screen, the REVE house projector image would be seen on the top of the installations. The installations also had to be sturdy enough to withstand multiple actors sitting, standing, climbing, and jumping on them at the same time. They were made out of $ and % plywood, and the modular design helped the soundness of the set by offering more weight-bearing supports compared to building it as one unit.

PAGE 72

The installations needed to have a functional use within the storyline. After much collaboration with the director, it was deemed necessary to have three different types of installations. The first was the Arch, the far stage left installation. Several times throughout the play, the scene takes place at an entryway of some kind, and it was important that the installations created a 3-dimensional playing space rather than simply having the setting appear on the 5 screens in the background. The installations became the entrance to the castle and the doorway to the scullery; the overhanging feature of the Arch provided that projection space without being overt. The goal was to have a set piece that could be a doorway one minute and a tree the next.

Next to the Arch was another tall installation piece that helped to frame the feeling of a doorway: the center stage left installation (the Tall One). It also lent itself to the several forest scenes, with trees projected on it. At one point in the show a tree comes to life, so the actor was able to perch herself on the installation and have the tree projected onto her body. Both stage left installations are 3 x 3 x 6 feet.

An installation that allowed for more interaction with the actors was created for stage right. It was longer than it was tall so that actors could stand or sit on it, use it as a table, and so on; the stage right installation was designed with dimensions of 8 x 5 x 4 feet.

PAGE 73

Several times throughout the performance, the stage right installation, or mound, needed to change to complement the action. For example, when the king sits on his throne, the center section folds back to give the semblance of a throne. There are two different moving pieces on the mound, which are changed throughout the play to fit the needs of the scene. All four configurations can be found in the Appendix: Installation Photo Gallery. When first designing the moving components, I was concerned about the ability to standardize the different configurations: if a reconfiguration was not in the same place as in previous scenes, the projections would be off. To avoid this issue, each moving piece is attached to a piano hinge, which regulates the movement so the map and the set will always match.

Two Christie projectors were mounted to the ceiling of the REVE below the center in-house projector and were used to project onto the installations: the left projector for the mound and the right for both stage left installations. When determining the final placement of the set pieces, the image size of the projectors needed to fill the entire space of the installations. With a throw distance (the distance from projector to destination) of approximately 20 feet, the image size of the Christie projectors was 18 $ x 14 $.

4.2 Projectors, Sight Lines, and Shadows

Placement of the installations was determined by the image size of the Christie projectors, the sight lines of the in-house projectors, as well as the shadows created by the installations themselves.

PAGE 74

In the design of the installations, the shadows that would be created if a higher portion of an installation protruded out farther than a lower portion were considered; the installations were therefore designed so that the deeper portions were at the bottom and they became narrower toward the top. Generally, when designing an installation, the projector is pointed straight at the object, eliminating shadows of this nature. However, working within the confines of the space, the installation projectors were mounted on the ceiling and aimed downward at an approximately 20° angle. In addition to the vertical angle of the projector, the horizontal angle of the installations had to be modified to minimize shadows on the set pieces. In the original location, there were several shadows created by the installation; after turning the installation angle slightly, the shadows disappeared.

Going into this staging of The Last Unicorn, the director and I acknowledged that sight lines would be an issue, both for the audience and for the actors. The installations go all the way from the floor to between 4 and 6 feet high. If the audience sat in non-raked seating, seeing the lower portion of the installations would have been an issue. To avoid this, all the seating was removed from the house and audience members were seated on the floor, with chairs in the back of the house for elderly patrons. Floor seating provided the more casual atmosphere that the play required, and it also rectified most of the audience sight line issues. Since the actors' playing area was on, around, and in front of the installations, preventing the projections from hitting the actors was impossible.

PAGE 75

Of course, when an actor stood next to an installation there were no issues with the projections, because they were mapped exactly to the set pieces. Standing in front of or sitting on the installations, on the other hand, meant the projection would fall directly on the actor's person. To combat this artifact of circumstance, a balance had to be created between projections and lighting. In moments where the action was more significant, actors would be well lit by the lighting, thus washing out the projections on their bodies as well as on the installations, and vice versa. Projections being washed out by lighting versus actors not being well lit is a constant battle in theatre. Having a projector with a high luminosity, or brightness, helps to overcome this problem. The Christie installation projectors have 5800 lumens.95 To put that in perspective, a standard 40-watt incandescent light bulb has 500 lumens,96 and the 16 projectors used in Disney's castle mapping performance each have 20,000 lumens. The projectors used for the installations in The Last Unicorn are decent projectors; however, any Digital Light Processing (DLP) projector can be washed out with enough environmental lighting.

4.3 Projection Artwork

The goal for the staging of The Last Unicorn was to use only original artwork. To do this, a Scenic Content Designer, Elaine Sponholtz, a Master's graduate from the Digital Worlds Institute, was brought onto the design team to design and paint the artwork that would become the backdrops for the scenes and the installations. A very important technical aspect of creating physical artwork for projected imagery is ensuring the digitization of the art is created at the exact resolution of the final projection screen. The 5-screen system at the REVE is 7000 pixels by 1050 pixels; each screen is 1400 x 1050. A single contiguous backdrop was created by painting 5 individual panels that fit together to make one scene. Each panel was the exact dimensions of one screen, 19.4 x 14.6.

PAGE 76

The first attempt to digitize the art was to scan each panel on an 11 x 17 scanner and blend the edges together. The scanner resolution was set much higher than the screen resolution, and when the scanned image tests were projected, noticeable moiré patterns appeared in the art. Moiré patterns (see image on right97) are created when a pattern in the subject overlaps an opposing pattern in the medium;98 for example, the pattern of the paintbrush strokes is offset from the lines in a scanner or computer screen. Ensuring matching resolution for all artwork and projectors is the only way to eliminate the moiré artifact. Photographing the artwork was the best way to digitize it for our purposes; it also allowed us to standardize the resolution without having to blend edges together.

4.4 Installation Artwork

The artwork for the installations was specifically created with the set pieces in mind. Using the silhouette of the installations as a guide, the art took a form similar to its surface. The Christie projectors were at a resolution of 1280 x 1024, and the artwork for them was the same. The installation artwork was treated as an extension of the 5-screen backdrop. At times the backdrop was a forest, a road, a scullery, or a cave; the installation artwork coordinated with a tree, haystacks, or a cupboard. When it was not necessary for the installations to be specific set pieces, textures were projected onto them to set the atmosphere of a cave or beach.

PAGE 77

Conclusion

This project is unique because it is not based solely in theory or in a proof of concept: an accessible, controllable piece of technical theatre has been created. The Polygon Mapper expands the scope of the industry by pulling the projections out of the background and into the action, while still providing the balance and control that is needed in a Total Theatre environment, made available to all types of theatres on any budget. This project in lieu of thesis represents time well spent in graduate school and the creation of a new career. In the words of Zachary Borovay:

I am a projection designer. I am not a lighting designer, although I use a specialized lighting instrument to convey my design. I am not a scenic designer, although my imagery can be graphic or scenic in nature. I am not a sound designer, although my media may include an audio element. I do not see my job as a stepping-stone to any other discipline. I do not aspire to be a lighting, scenic, or sound designer. I am perfectly happy being a projection designer.99

1 Bursill, Web.
2 Magic lantern, Wikipedia.
3 Szanto, 23.
4 Innes, 23.
5 Dawson, 1.
6 Innes, 184.
7 Emily Mann's credits include Still Life (19), earning Mann 6 Obie Award nominations, including Distinguished Playwriting and Distinguished Direction, and Having Our Say (1995), nominated for 3 Tony Awards including Best Play and Best Director. LoBiondo.
8 Dawson, 5.
9 Dawson, 8. Partial table from text.
10 Piscator, 182.

PAGE 78

11 Ibid.
12 Erwin Piscator (1893-1966).
13 Innes, 16.
14 Tytell, 27.
15 Innes, 17.
16 Hopkins, Introduction.
17 Hopkins, 1.
18 Esaak, Web.
19 Ibid.
20 Hopkins, 11.
21 Innes, 17.
22 Probst, 29.
23 Innes, 17.
24 Innes, 18. Other influences include: the exaggeration of physical character through masks, experimentation with gramophones, breaking the illusion between stage action and audience in order to alienate the audience.
25 Innes, 18.
26 Innes, 107.
27 Probst, 9-10.
28 Dawson, 59.
29 Probst, 16.
30 Probst, 14.
31 Leverich, 440.
32 Leverich, 346.
33 Bloom, 23.
34 Kramer, Web.
35 Savran, Introduction.
36 Dawson, 60.
37 Beck, Web.
38 Savran, Introduction.
39 The Builders Association, Web.
40 Innes, 189.
41 Innes, 198.
42 Innes, 199.
43 Brecht, 77.
44 Innes, 192-3.
45 Zachary Borovay, Personal Website.
46 Peter Flaherty, Personal Website.
47 Wendall Harrington, Personal Website.
48 Dodson, web introduction.
49 V Squared Labs.
50 1024 Architecture.
64 Troika Tronix, Isadora.
65 Proposed software developed and outlined in this thesis.
66 MadMapper.

PAGE 79

67 Modul8 VJ Software.
68 Modul8 with MapMapMap, Blog.
69 Resolume VJ Software.
70 VPT 6.0.
71 Ibid.
72 Cycling '74.
73 Quartz Composer.
74 Figure 53, QLab.
75 Dataton Watchout.
76 Green Hippo.
77 Troika Tronix Isadora, Web. Software description.
78 Syphon.
79 Green Hippo.
80 Isadora, Keyboard Watcher Actor Help Description.
81 Spatial anti-aliasing, Wikipedia.
82 Supersampling, Wikipedia.
83 Decimal values are rounded up; the largest value is 255.
84 Jitter.h, optimized jitter point values.
85 The other jitter point tests can be viewed in the source code appendix.
86 Algorithm created by Angelos Barmpoutis.
87 Video clips of animated map effects can be found at http://vimeo.com/brittdesign/videos.
88 Isadora, Wave Generator Actor Help.
89 Isadora, Envelope Generator Actor Help.
90 Finley, sample source code.
91 MIDI, Wikipedia.
92 Ableton website.
93 Beagle, The Last Unicorn, novel.
94 The Last Unicorn, animated film.
95 Christie Projectors MSRP.
96 Incandescent light bulb, Wikipedia.
97 Photography Credit, "Moire on Parrot Feathers.jpg".
98 Moiré pattern, Wikipedia.
99 Borovay, Web.

PAGE 80

APPENDIX A: Manual Introduction 82 SECTION 1. Isadora Installation 82 1.1 System Requirements 83 1.2 Download 84 1.3 Installation 84 2. FreeFrame Plugins Installation 86 2.1 Download 86 2.2 File Path 86 3. Polygon Mapper Plugin Installation 86 3.1 File Path 86 4. Using the Polygon Mapper 86 4.1 Isadora Graphical User Interface 87 4.1.1 GUI Terminology 87 Scene Editor 87 Toolbox 87 Toolbox Filter 87 Pop-up Toolbox 87 Scene List 87 4.1.2 Other Terminology 88 Actor 88 Stage 88 Patch 88 4.2 Importing Media 89 4.2.1 Acceptable file formats 89 4.2.2 Media Bin 90 4.3 Creating the Patch 90 4.3.1 Retrieving the actors 90 4.3.2 The Actors 92 Media Player 92 Polygon Mapper: Dynamic 92 Projector 92 Stage Mouse Watcher 92 Keyboard Watcher 92 Toggle 92 Pan, Spin, Zoom 92

PAGE 81

! "#! 4.3.3 The Stage Setup 92 4.3.4 Step-by-Step Patch Build 95 Step 1: Add Media Player 95 Step 2: Load Media 95 Step 3: Add Polygon Mapper 96 Step 4: Add Projector 96 Step 5: Add Stage Mouse Watcher 97 Step 6: Add Keyboard Watcher 97 Step 7: Add Toggle 98 4.4 Creating a Polygon (Continued from section 4.3.4) 98 Step 8: Adjust Dot Size 98 Step 9: Add Vertices 98 Step 10: Show Stages 99 Step 11: Turn on Edit Mode 99 Step 12: Map Vertex 99 Step 13: Move to Next Vertex 99 Step 14: Map remaining Vertices 100 Step 15: Turn off Edit Mode 100 Step 16: Turn on Anti-Aliasing 101 4.5 Aligning Media 101 4.6 Adding Effects 102 Footnotes 103

PAGE 82

This manual details the download, installation, and use of Isadora, created by Mark Coniglio; the FreeFrame plugins created by Pete Warden; and the Polygon Mapper plugin created exclusively for Isadora. The Polygon Mapper plugin was the subject of a project in lieu of thesis presented to the College of Fine Arts of the University of Florida in partial fulfillment of the requirements for the degree of Master of Arts in Digital Arts and Sciences under the title NON-RECTILINEAR PROJECTION DESIGN FOR LIVE CUE-ABLE THEATRICAL PERFORMANCE by Brittany Powell, December 2011.

1. Isadora Installation

Isadora is the award-winning, graphic programming environment for Macintosh and Windows that provides interactive control over digital media, with special emphasis on the real time manipulation of digital video. Because every performance or installation is unique, Isadora was designed not to be a "plug and play" program, but instead to offer building blocks that can be linked together in nearly unlimited ways, allowing you to follow your artistic impulse.1

PAGE 83

! "#! 1.1 System Requirements Please check the system requirements before downloading Isadora. Version System Requirements2 Mac OS X Standard Requires OS X 10.3 or great er Intel or Power PC Processor 1.0 Ghz / 1.0 GB RAM (minimum) 2.0 Ghz / 2.0 GB RAM (recommended) Quicktime 7 minimum, 7.5 recommended Mac OS X Core The Core version of Isadora leverages the Mac OS X operating system by "adopting" all Core Imag e, Quartz Composer, and Core Audio plugins found on your computer, making these modules available within Isadora. (This includes third party plugins, as long as they are installed in the standard locations.) The Core Video feature (Core Image + Quartz Composer plugins) costs US$ 25 per license; the Core Audio features cost an additional US$ 25 per license. You may purchase these upgrades when you order the Mac OS X Standard Version, or you may add them at a later time. Requires OS X 10.4 or greater Intel or Power PC Processor (Core Duo recommended) 1.0 Ghz / 1.0 GB RAM (minimum) 2.0 Ghz / 2.0 GB RAM (recommended) Quicktime 7.5 or greater Windows 7 /Vista /XP Windows 7 / Vista / XP Intel Pentium 4 minimum, Core Duo recommended 1.0 Ghz / 1.0 GB RAM (minimum) 2.0 Ghz / 2.0 GB RAM (recommended) Quicktime 7.5 or greater USB Key Version All Isadora versions may be ordered with an optional USB Key. If you must frequently move your working environment from computer to computer, then purchasing a USB Key may prove useful for you. You may also choose to switch to the USB Key version at a later date. Please make sure to read the USB Key

PAGE 84

Policy3 before purchasing a key.

1.2 Download the latest pre-release version

Visit the Troika Tronix website at http://www.troikatronix.com/izzy-download.html to download Isadora. You may download and use the demo version of Isadora for free; it has all the functionality of the full version with the exception of being able to save your work. You must register and purchase a license to enable the save function. Downloading the OS X Core version requires an additional audio and video upgrade available for purchase. The latest pre-releases can be found at http://www.troikatronix.com/izzyprereleases.html. Currently, the most recent pre-release is version 1.3 of 19, released on September 29th, 2011.4

1.3 Isadora Installation

(If Isadora has already been installed, please skip to section 2.0 on pg. *)

The following are the download instructions for Mac OS X; the Windows installation will vary slightly. Download the .dmg file for installation. If you do not see the Introduction page automatically, click on the installer icon on the desktop, then click on the Isadora Core Installer button at the bottom of the window. The Introduction begins the installation. Press Continue.

PAGE 85

! "#! Follow the prompts through the installation process.

PAGE 86

! "#! 2. FreeFrame Plugin Installation (If Isadora has already been installed, please skip to section 3.0 on pg. *) FreeFrame is an open-source cross-platform real-time video effects plugin system.5 2.1 Download and Install For optimal use of the Polygon Mapper download Pete Wardens FreeFrame Plugin update found at http://www.troikatronix.com/izzy-prereleases.html. Once the .dmg file has been downloaded follow the installation guide in 1.3 2.2 File Path The FreeFrame plugins should automatically be put in the proper file path: Library/Application Support/FreeFrame/ 3. Polygon Mapper Installation This plugin will allow for the mapping of objects with a nearly unlimited number of vertices. 3.1 File Path Place the .izzyplug file in the same file as the FreeFrame Plugin. File path: / Library/Application Support/Free Frame/ 4. Using the Polygon Mapper (If you are familiar with the GUI of Isadora, skip to section 4.3) The Polygon Mapper works in conjunction with Isadora actors and FreeFrame plugins to produce a multi-point object mask with a high level of versatility and control.

PAGE 87

! "#! 4.1 Isadora Graphical User Interface Once Isadora, FreeFrame plugins and the Polygon Mapper plugin have been installed restart Isadora. The GUI will appear on your screen. 4.1.1 GUI Terminolog y A. Scene Editor This is where a scene is created. A scene is a container that has several actors connected to create a patch that formulate a certain situation. B. Toolbox This is where all the actors inside Isadora reside. C. Toolbox Filter Filters the actors into 11 different groups based on its functionality i.e. video group, communication group. D. Pop-up Toolbox double clicking inside the scene editor brings up the pop -up toolbox. Type a few letters of the actor you want to place in the scene and the list will populate with all actors fitting the spelling, then arrow down and select enter for the actor to be placed in the scene editor. E. Scene List a time line that holds all the different scenes for your show that you can move linearly or to specific scenes backwards or forwards.

PAGE 88

! ""! 4.1.2 Other Terminology Actor: Also called modules. Actors are visually represented by a dark grey box (blue box when selected) containing inputs and/or outputs that allow for something to appear, an effect to be applied or a change to occur. Stage: Stages are the output screens where the media is sent. Isadora allows for 6 stages in addition to a control screen. Changes to the stages can be made in the Preferences window. Go to Preferences in the Isadora Toolbar / Stage. Each stage can be placed on any display that is connected to the main computer. For example, If you have a c ontrol computer connected to a Projector, stage 1 would be placed on Display 2. The control screen or main display is not considered a stage. Also, you can control the size of the stage at the top of the preferences window or with the drop down menus by each stage. Patch: A Patch is two or more actors connected together by a link.

PAGE 89

4.2 Importing Media

After you have opened Isadora, you need to import media before you begin mapping. It is not imperative to import all the media, or even the final media; you can simply import a placeholder so the mapping can be done. Go to File / Import Media. Shortcut: Command/Shift/I on a Mac, Control + M on a PC. The browser for importing media will appear.

4.2.1 Acceptable media file formats

Image files: .JPG .PNG .PDF .PSD
Video files: .MOV* .AVI .QTZ
Audio files: .WAV .MP3 .M4V**
3D files: .3DS
MIDI files: .MID

* MOV files should be Quicktime or Photo-JPEG. Be cautious when using H264 movies, especially on a PC; there is a bug in Isadora that may cause it to crash.
** M4V files play audio, but they must be played with a Sound Movie Player actor or a Movie Player actor. This differs from other audio files, which must be played by a Sound Player actor.

PAGE 90

! "#! 4.2.2 Media Bin Once the media has been imported it will be placed in the appropriate bin: Video Files, Audio Files, MIDI Files, Pictures, 3D Models. 4.3 Creating the Patch This next section will explain the steps to build the patch to map a polygon. There are 7 different actors used in the creation of this patch. Please make sure both the P olygon Mapper and the FreeFrame plugins have been installed before continuing. (See sections 2 and 3) 4.3.1 Retrieving the actors As discussed in section 4.1 on the GUI, there are three ways to retrieve an actor to place in the scene editor. The fi rst way is using the toolbox itself. Simply scroll through the list of all the actors in alphabetical order until you find the one you need. Click on the name and a green circle with a plus sign will appear. You do not need to drag and drop the actor. Move the green plus sign into the scene editor and you will see the actor appear next to it. Click the spot inside the editor where you would like the actor to go.

PAGE 91

! "#! The second way is using the toolbox filter. Type the name of th e actor you need in the box and the toolbox will filter the list. Click on the name in the list and drop it into the scene editor.

PAGE 92

! "#! The final and most efficient way to retrieve an actor is to use the pop-up toolbox. Double click inside the scene editor. Be sure to click on the spot where you want the actor to go. Type the first few letters of the actor you want and the filtered list will appear. Arrow to the appropriate actor and press enter. The actor will appear in that exact spot. 4.3.2 The Actors Media Player It can be a Picture or Movie Player. The Picture Player simply outputs a picture imported into the media [bin] as a video stream.6 The Movie Player allows playback control of the movie imported into the media [bin]. The visibility, speed, loop points and the position of the movie can be modified.7 Polygon Mapper: Dynamic The Polygon Mapper allows for a nearly unlimited number of vertices making up a polygon to be dynamically added, mapped and filled with content.

PAGE 93

Projector
The Projector positions, scales, and renders a video stream to a specified stage.8

Stage Mouse Watcher
The Stage Mouse Watcher sends information about the mouse when it is within the stage specified by the stage input. It reports mouse clicks and releases, the mouse position, and whether or not the mouse is within the stage.

Keyboard Watcher
The Keyboard Watcher looks for keys on the computer keyboard to be pressed, released, or both. The key range input property can be set to limit the range of characters that this watcher will see. When this watcher sees a character within the specified range, it will send the character that was typed out of the key output.9

Toggle
The Toggle actor toggles between an on and off state each time a trigger is received from the trigger input.10

PanSpinZoom
PanSpinZoom is an optional FreeFrame plugin that allows for the manipulation of the media independent of the map. As indicated by its name, this actor can pan, spin, or zoom the picture or video file.

4.3.3 The Stage Setup

If you are simply testing the Polygon Mapper without a projector connected, then the stage window will float on the main display.

PAGE 94

Show stages by going to Output > Show Stages. Shortcut: Command + g on a Mac, Control + g on a PC. If you would like this display to be bigger, go to Isadora > Preferences > Stage > Stage Size and change the dimensions. Be sure Stage 1 is placed on Display 2, even if you do not have a second display connected. Be sure to SELECT Floating Stage Windows so the stage window is visible when editing the patch. Be sure to DESELECT Hide Cursor When Full Screen under General Stage Options: it is necessary to see the cursor at all times when using the Polygon Mapper.

PAGE 95

4.3.4 Step-by-Step Patch Build

Step 1: Add a Picture Player to the scene editor. For the purposes of initial mapping, use a Picture Player to avoid having moving images to map.

Step 2: Load the image into the Picture Player. Control + M brings up the media bin. Insert the number 1 in the picture input of the Picture Player and press enter. Note: use the number corresponding to the desired picture.

PAGE 96

! "#! Step 3: Add a Polygon Mapper Actor to the scene editor. Connect the video out output from the picture to the video in input on the Polygon Mapper by clicking once on the output dot on the Picture Player and then clicking a second time on the input dot. Step 4: Add a Projector to the Scene Editor and connect the video out from the Polygon Mapper to the video in on the Projector.

PAGE 97

Step 5: Add a Stage Mouse Watcher to the Scene Editor.
Connections: Stage Mouse Watcher (horz. pos) -> Polygon Mapper (mouseX); Stage Mouse Watcher (vert. pos) -> Polygon Mapper (mouseY).

Step 6: Add a Keyboard Watcher to the Scene Editor. Click on the black box to the left of the key range and type in 0-255 and press enter. This allows for all the keys to be watched.
Connections: Keyboard Watcher (key) -> Polygon Mapper (keyCode).

PAGE 98

Step 7: Add a Toggle to the Scene Editor.
Connections: Stage Mouse Watcher (right mouse down) -> Toggle (trigger); Toggle (trigger out) -> Polygon Mapper (edit).

4.4 Creating the Polygon (continued from section 4.3.4)

Once the patch has been completed, it is time to start creating the polygon map.

Step 8: Change the dot size to 5 if you are mapping something large, 1 if something small, or somewhere in between depending on the size of the object.

Step 9: Add vertices in the verticesCount input. Click inside the black box to the left of vertices, enter a number between 3 and 999, and press enter. For this example we will begin with 6 vertices. Each vertex is represented by an X and Y input labeled vertex 0 X, vertex 0 Y. Vertex number 1 is referred to as zero.

PAGE 99

Step 10: Show stages (Command + g / Control + g). The full image will appear with the edit dots evenly spaced out around the perimeter of the stage. If you do not see the green and red dots, see Step 11; otherwise, skip to Step 12.

Step 11: Turn the edit mode on. Move the mouse over the stage window and right-click. You should see the edit dots appear or disappear as the mode toggles, and you should see the trigger from the Stage Mouse Watcher trigger the Toggle, which triggers the edit input of the Polygon Mapper. Practice right-clicking to see the edit mode switch on and off.

Step 12: Move the mouse over the stage. As the mouse moves, the first vertex X and Y follows the mouse, represented by the green dot. The vertex dot will only follow the mouse if these three things are true: one, the edit mode is on; two, the mouse is in the stage window; three, the mouse first touches the top left corner where the dot is. In order for the dot to move with the mouse, the mouse must first touch the location of the dot being edited (the red dot). The dot then sticks to the mouse until it is released by changing edit dots. Move the first edit dot into the desired position.

Step 13: Press the left arrow key to move to the next edit dot. The second vertex X and Y will now turn green. The left and right arrow keys rotate through the vertices. All the dots are red unless they are being edited.

PAGE 100

Step 14: Repeat steps 12 and 13 until all the dots are mapped to the corners of the cube. (For practice, just create a shape.) Note: as you map vertices you will notice the values from the Stage Mouse Watcher being output to the mouseX and mouseY inputs of the Polygon Mapper. Those values are then transferred to the appropriate values for the vertex being edited.

Step 15: Right-click on the stage window to turn off edit mode. All the dots will disappear and your map is complete.

PAGE 101

! "#"! Step 16: Anti-aliasing should only be added once the map is complete. There are eight levels of anti-aliasing: 0-7. Choose the proper anti-aliasing for your map. Note: Level 4 should suffice for most maps. Going any higher may create lag with your media. 4.5 Aligning Media For optimum control over the map add the FreeFrame Plugin actor PanSpinZoom between the media player and the Polygon Mapper.

PAGE 102

[Figures: Spin; Zoom; Pan]

4.6 Adding Effects

Several effects can be added to the media inside the polygon. These actors must be placed between PanSpinZoom and the Polygon Mapper. More than one effect can be added at the same time by linking multiple effects together and then connecting them as a whole between PanSpinZoom and the Polygon Mapper. Most of the effects used in Isadora are FreeFrame plugins.

PAGE 103

Examples of other effects: Solarize Glow Kaleidoscope Burn TV Pixel Dots

PAGE 104

! "#$! //As of 1023 // =========================================================================== // Isadora Demo Plugin Mark F. Coniglio. All rights reserved. // =========================================================================== // // IMPORTANT: This source code ("the software") is supplied to you in // consideration of your agreement to the following terms. If you do not // agree to the terms, do not inst all, use, modify or redistribute the // software. // // Mark Coniglio (dba TroikaTronix) grants you a personal, non exclusive // license to use, reproduce, modify this software with and to redistribute it, // with or without modifications, in source and/or binary form. Except as // expressly stated in this license, no other rights are granted, express // or implied, to you by TroikaTronix. // // This software is provided on an "AS IS" basis. TroikaTronix makes no // warranties, express or implied, including without limitation the implied // warranties of noninfringement, merchantability, and fitness for a // particular purpose, regarding this software or its use and operation // alone or in combination with your products. // // In no event shall TroikaTronix be liable for any special, indirect, incidental, // or consequential damages arising in any way out of the use, reproduction, // modification and/or distribution of this software. // // =========================================================================== // // CUSTOMIZING THIS SOURCE CODE // To customize this file, search for the text ###. All of the places where // you will need to customize the file are marked with this pattern o f // characters. // // ABOUT IMAGE BUFFER MAPS: // // The ImageBufferMap structure, and its accompanying functions, // exists as a convenience to those writing video processing plugins. // // Basically, an image buffer contains an arbitrary number of inpu t and // output buffers (in the form of ImageBuffers). The ImageBufferMap code // will automatically create intermediary buffers if needed, so that the // size and depth of the source image buffers sent to your callback are // the same for all buffers. // // Typically, the ImageBufferMap is created in your CreateActor function, // and dispose in the DiposeActor function. // --------------------------------------------------------------------------------// INCLUDES // -------------------------------------------------------------------------------#include "IsadoraTypes.h" #include "IsadoraCallbacks.h" #include "ImageBufferUtil.h" #include "PluginDrawUtil.h" // STANDARD INCLUDES #include #include // --------------------------------------------------------------------------------// MacOS Specific // --------------------------------------------------------------------------------#if TARGET_OS_MAC #define EXPORT_ #endif APPENDIX B Polygon Mapper Source Code

PAGE 105

! "#$! // -------------------------------------------------------------------------------// Win32 Specific // --------------------------------------------------------------------------------#if TARGET_OS_WIN32 #include //added to windows line, will not complie in mac if in standard// #define EXPORT_ __declspec(dllexport) #ifdef __cplusplus extern "C" { #endif BOOL WINAPI DllMain ( HINSTANCE hInst, DWORD wDataSeg, LPVOID lpvReserved ); #ifdef __cplusplus } #endif BOOL WINAPI DllMain (HINSTANCE /* hInst */,DWORD wDataSeg, LPVOID /* lpvReserved */) { switch(wDataSeg) { case DLL_PROCESS_ATTACH: return 1 ; break; case DLL_PROCESS_DETACH: break; default: return 1 ; break; } return 0 ; } #endif // --------------------------------------------------------------------------------// Exported Function Definitions // --------------------------------------------------------------------------------#ifdef __cplusplus // non command part if computer understands c++, the SDK will use C// extern "C" { #endif EXPORT_ void GetActorInfo(void* inParam, ActorInfo* outActorParams); // takes out part of c++ if computer //is not c++// #ifdef __cplusplus } #endif // must be included from here up// // --------------------------------------------------------------------------------// FORWARD DECLARTIONS //PROGRAM STARTS// TYPES and NAMES // staic is one way to achieve communication between multiple actors, making it available to all// // --------------------------------------------------------------------------------static void // void means there is no output// ReceiveMessage( IsadoraParameters* ip, MessageMask inMessageMask, PortIndex inPortIndex1, const MsgData* inData, UInt32 inLen, long inRefCon); // --------------------------------------------------------------------------------

PAGE 106

! "#$! // GLOBAL VARIABLES // --------------------------------------------------------------------------------// ### Declare global variables, common to all instantiations of this plugin here // Example: static int gMyGlobalVariable = 5; // --------------------------------------------------------------------------------// PluginInfo struct // -------------------------------------------------------------------------------// ### This structure neeeds to contain all variables used by your plugin. Memory for // this struct is allocated during the CreateActor function, and disposed during // the DisposeActor function, and is private to each copy of the plugin. // // If your plugin needs global data, declare them as static variables within this // file. Any static variable will be global to all instantiations of the plugin. typedef struct { ActorInfo* mActorInfoPtr; // our ActorInfo Pointer set during createactorfn MessageReceiverRef mMessageReceiver; // pointer to our message receiver reference Boolean mNeedsDraw; // set true when the video output needs to be drawn ImageBufferMap mImageBufferMap; // used by most video plugins Boolean mBypass; Boolean mEditMode; //turns on/off edit dots// int mEditSize; //size of dot int mVerticesCount; // vertices count int mVerticesIndex; // current vertices index int mVertexCoordinateMin; int mVertexCoordinateMax; float mCurrentWidth; float mCurrentHeight; int mCurrentVertexIndex; int mLastKeyCode; float* mCoordinates; // polygon vertices is added for float to define an empty array of unknown size syntax called pointer// int* AlphaMask; // point in polygon alpha mask saved in here int AlphaMaskWidth; int AlphaMaskHeight; Boolean RecalulateMask; // trigger the need for a recalulation of the mask Boolean justStartedEditingX; Boolean justStartedEditingY; int anti_aliasing; //optimized antialiasing Value valuesInit[128]; Value valuesMin[128]; Value valuesMax[128]; } PluginInfo; static void addInputProperty( IsadoraParameters* ip, ActorInfo* inActorInfo, PluginInfo* info, const char* nameTemplate, const int index ); static void removeInputProperty( IsadoraParameters* ip,

PAGE 107

! "#$! ActorInfo* inActorInfo, const PluginInfo* info ); // A handy macro for casting the mActorDataPtr to PluginInfo* #if __cplusplus #define GetPluginInfo_(actorDataPtr) static_cast((actorDataPtr) >mActorDataPtr); #else #define GetPluginInfo_(actorDataPtr) (PluginInfo*)((actorDataPtr) >mActorDataPtr); #endif // --------------------------------------------------------------------------------// Constants // --------------------------------------------------------------------------------// Defines various constants used throughout the plugin. // ### GROUP ID // Define the group under which this plugin will be displayed in the Isadora interface. // These are defined under "Actor Types" in IsadoraTypes.h static const OSType kActorClass = kGroupVideo; // ### PLUGIN IN // Define the plugin's unique four character identifier. Contact TroikaTronix to // obtain a unique four character code if you want to ensure that someone else // has not developed a plugin with the same code. Note that TroikaTronix reserves // all plugin codes that begin with an unde rline, an atsign, and a pound sign // (e.g., '_', '@', and '#'.) static const OSType kActorID = FOUR_CHAR_CODE( 'PR12'); // ### ACTOR NAME // The name of the actor. This is the name that will be shown in the User Interface. static const char* kActorName = "Polygon Mapper: Dynamic" ; // ### PROPERTY DEFINITION STRING // The property string. This string determines the inputs and outputs for your plugin. // See the IsadoraCallbacks.h under the heading "PROPERTY DEFINITION STRING" for the // meaning ofthese codes. (The IsadoraCallbacks.h header can be seen by opening up // the IzzySDK Framework while in the Files view.) // // IMPORTANT: You cannot use spaces in the property name. Instead, use underscores (_) // where you want to have a space. // // Note that each line ends with a carriage return ( \ r), and that only the last line of // the bunch ends with a semicolon. This means that what you see below is one long // nullterminated cstring, with the individual lines separated by carriage retur ns. static const char* sPropertyDefinitionString = // INPUT PROPERTY DEFINITIONS // TYPE PROPERTY NAME ID DATATYPE DISPLAY FMT MIN MAX INIT VALUE "INPROP video_in vin data video 0 \ r" "INPROP mouseX mosX float number 0 \ r" "INPROP mouseY mosY float number 0 \ r" "INPROP keyCode keyC int number 0 \ r" "INPROP antiAliasing anti int number 0 7 0 \ r" "INPROP fValue fVal float number 0 10240 1 0 1 \ r" "INPROP edit edit bool onoff 0 1 1 \ r" "INPROP dotsize dot int number 0 5 0 \ r" "INPROP verticesCount vcnt int number,mutable1 0 999 0 1 \ r" // OUTPUT PROPERTY DEFINITIONS // TYPE PROPERTY NAME ID DATATYPE DISPLAY FMT MIN MAX INIT VALUE "OUTPROP video_out vout data video 0 \ r"; // ### Property Index Constants

PAGE 108

! "#$! // Properties are referenced by a one based index. The first input property will // be 1, the second 2, etc. Similarly, the first output property starts at 1. // You whould have one constant for each input and output property defined in the // property definition string. enum //just labeling the input/output// { kInputVideoIn = 1 kInputMouseX, kInputMouseY, kInputKeyCode, kAntiAliasing, kInputFValue, kInputEditMode, kInputEditSize, kInputVerticesCount, kOutputVideo = 1 }; // --------------------// Help String // --------------------// ### Help Strings // // The first help string is for the actor in general. This followed by help strings // for all of the inputs, and then by the help strings for all of the outputs. These // should be given in the order that they are defined in the Property Definition // String above. // // In all, the total number of help strings should be (num inputs + num outputs + 1) // // Note that each string is followed by a comma -it is a common mistake to forget the // comma which results in the two strings being concatenated into one. const char* sHelpStrings[] = { "Polygon Mapper allows for vertices in a polygon to be mapped and filled with content." "This version of the actor allows for 6 points at a resolution of 640x480. For additional points and different resolutions, new versions of the actor will be programmed." "The video source that will fill the map. It can be a video, picture, shape, etc .", "Mouse X position", "Mouse Y position", "AntiAliasing Values", "fValue", "Edit: turns the edit dots on and off" "Edit Size: changes the value of the edit dots from 0 pixels 5 pixels", "Vertices count", "The mapped video output." }; // --------------------------------------------------------------------------------// CreateActor // standard create and destroy an actor// // --------------------------------------------------------------------------------// Called once, prior to the first activation of an actor in its Scene. The // corresponding DisposeActor actor function will not be called until the file // owning this actor is closed, or the actor is destroyed as a result of being // cut or deleted. static void CreateActor( IsadoraParameters* ip, ActorInfo ioActorInfo) // pointer to this actor's ActorInfo struct unique t o each instance of an actor { // creat the PluginInfo struct initializing it to all zeroes PluginInfo* info = (PluginInfo*) IzzyMallocClear_(ip, sizeof( PluginInfo));

PAGE 109

! "#$! info> mCoordinates = (float*) IzzyMallocClear_(ip, sizeof( float) 999 2 ); PluginAssert_(ip, info != nil); ioActorInfo> mActorDataPtr = info; info> mActorInfoPtr = ioActorInfo; // ### allocation and initialization of private member variables // set number of input and output buffers in our buffer map // and then initialize it info> mImageBufferMap. mInputBufferCount = 1 ; info> mImageBufferMap. mOutputBufferCount = 1 ; info> mVertexCoordinateMin = 0 ; info> mVertexCoordinateMax = 10240; info> AlphaMaskWidth= 1 ; info> AlphaMaskHeight= 1 ; info> AlphaMask= NULL; info> RecalulateMask= true; CreateImageBufferMap (ip, &info> mImageBufferMap); } // --------------------------------------------------------------------------------// DisposeActor // --------------------------------------------------------------------------------// Called when the file owning this actor is closed, or when the actor is destroyed // as a result of its being cut or deleted. // static void DisposeActor( IsadoraParameters* ip, ActorInfo* ioActorInfo) // pointer to this actor's ActorInfo struct unique to each instance of an actor { PluginInfo* info = GetPluginInfo_(ioActorInfo); PluginAssert_(ip, info != nil); // ### destruction of private member variables // destroy our image buffer map DisposeImageBufferMap (ip, &info> mImageBufferMap); // destroy the PluginInfo struct allocated with IzzyMa llocClear_ the CreateActor function PluginAssert_(ip, ioActorInfo> mActorDataPtr != nil); IzzyFree_(ip, info> mCoordinates); IzzyFree_(ip, ioActorInfo> mActorDataPtr); } // -------------------------------------------------------------------------------// CreatePropertyID [INTERRUPT SAFE] // --------------------------------------------------------------------------------inline OSType CreatePropertyID( IsadoraParameters* ip, const char* inRateBase, SInt32 inIndex) { const SInt32 kOneCharMax = 26; const SInt32 kTwoCharMax = kOneCharMax kOneCharMax; PluginAssert_(ip, inRateBase[0 ] != 0 && inRateBase[1 ] != 0 ); PluginAssert_(ip, inIndex >= 0 && inIndex < kTwoCharMax 2 ); OSType result = (((UInt32) inRateBase[0 ]) << 24) | (((UInt32) inRateBase[1 ]) << 16);

PAGE 110

! ""#! SInt32 indexLS; SInt32 indexMS; SInt32 indexOffset; // in index is between 00 and 99 if (inIndex >= 0 && inIndex < 100) { indexMS = inIndex / 10; indexLS = inIndex % 10; result |= ( (((UInt32) (indexMS + '0')) << 8 ) | (((UInt32) (indexLS + '0')) << 0 ) ); // if between 100 and 776 } else if (inIndex >= 100 && inIndex < 100 + kTwoCharMax) { indexOffset = inIndex 100; PluginAssert_(ip, indexOffset >= 0 && indexOffset < kTwoCharMax) ; indexMS = indexOffset / kOneCharMax; indexLS = indexOffset % kOneCharMax; result |= ( (((UInt32) (indexMS + 'A')) << 8 ) | (((UInt32) (indexLS + 'A')) << 0 ) ); // if between 776 and 1452 } else if (inIndex >= 100 + kTwoCharMax && inIndex < 100 + kTwoCharMax 2 ) { indexOffset = inIndex ( 100 + kTwoCharMax); PluginAssert_(ip, indexOffset >= 0 && indexOffset < kTwoCharMax); indexMS = indexOffset / kOneCharMax; indexLS = indexOffset % kOneCharMax; result |= ( (((UInt32) (indexMS + 'a')) << 8 ) | (((UInt32) (indexLS + 'a')) << 0 ) ); } else { PluginAssert_(ip, false); } return result; } // --------------------------------------------------------------------------------// ActivateActor // -------------------------------------------------------------------------------// Called when the scene that owns this actor is activated or deactivated. The // inActivate flag will be true when the scene is activated, false when deactivated. // static void ActivateActor( IsadoraParameters* ip, ActorInfo* inActorInfo, // pointer to this actor's ActorInfo struct unique to each instance of an actor Boolean inActivate) // true when actor is becoming active, else false { PluginInfo* info = GetPluginInfo_(inActorInfo); // -----------------------// ACTIVATE // -----------------------if (inActivate) { // Isadora passes various messages to plugins that request them. // These include Mouse Moved messages, Key Down/Key Up messages, // Video Frame Clock messages, etc. The complete list can be found // in the enumeration in MessageReceiverCommon.h // You ask Isadora for these messages by calling CreateMessageReceiver_ // with a pointer to your function, and the me ssage types you would // like to receive. (These are bitmapped flags, so you can combine as

PAGE 111

! """! // many as you like: kWantKeyDown | kWantKeyDown for instance.) // Here we request that our ReceiveMessage function is called // whenever the Isadora New Video Frame message is sent, // which happens periodically, 30 times per second. We set the ref // con to our ActorInfo ptr so that we can access that information // from ReceiveMessage callback. MessageReceiveFunction* msgRcvFunc = ReceiveMessage; // if the "bypass" flag is off, then we want to receive messages // if (info>mBypass == false) { // we should not already have a message receiver PluginAssert_(ip, info> mMessageReceiver == nil); // create a message rece iver that will be notified of // video frame ticks info> mMessageReceiver = CreateMessageReceiver_ ( ip, msgRcvFunc, 0 kWantVideoFrameTick, ( long) inActorInfo); // } // set the needs draw flag so that we will be drawn as soon // as possible info> mNeedsDraw = true; // -----------------------// DEACTIVATE // -----------------------} else { // dispose our message receiver when we are deactivated. if (info> mMessageReceiver != nil) { DisposeMessageReceiver_ (ip, info> mMessageReceiver); info> mMessageReceiver = nil; info> mNeedsDraw |= true; } // ### dispose any data that you don't need when // you are not active. DisposeOwnedImageBuffers (ip, &info> mImageBufferMap); ClearSourceBuffers(ip, &info> mImageBufferMap); } } // --------------------------------------------------------------------------------// GetParameterString // -------------------------------------------------------------------------------// Returns the property definition string. Called when an instance of the actor // needs to be instantiated. static const char* GetParameterString( IsadoraParameters* /* ip */, ActorInfo* /* inActorInfo */) { return sPropertyDefinitionString ; } // --------------------------------------------------------------------------------// GetHelpString // --------------------------------------------------------------------------------// Returns the help string for a particular property. If you have a fixed number of // input and output properties, it is best to use the PropertyTypeAndIndexToHelpIndex_

PAGE 112

! ""#! // function to determine the correct help string to return. static void GetHelpString( IsadoraParameters* ip, ActorInfo* inActorInfo, PropertyType inPropertyType, //kPropertyTypeInvalid w hen requesting help for the actor P ropertyIndex inPropertyIndex1,// the onebased index of the property (when inPropertyType is not kPropertyTypeInvalid) char* outParamaterString, // receives the help string UInt32 inMaxCharacters) // size of the outParamaterString buffer { const char* helpstr = nil; // The PropertyTypeAndIndexToHelpIndex_ converts the inPropertyType and // inPropertyIndex1 parameters to determine the zero based index into // your list of help strings. if ((inPropertyType == kInputProperty) && (inPropertyIndex1 > kInputVerticesCount )) { // copy it to the output string strncpy(outParamaterString, "Vertex coortinates" inMaxCharacters); } else { UInt32 index1 = PropertyTypeAndIndexToHelpIndex_ (ip, inActorInfo, inPropertyType, inPropertyIndex1); // get the help string helpstr = sHelpStrings[index1]; // copy it to the output string strncpy(outParamaterString, helpstr, inMaxCharacters); } } void setInitialCoordinates( float* points,int numofpoints) { int m=(numofpoints)%(4 ); int n=(numofpointsm)/4 ; int i; int l; int c=0 ; //upper if(m>0 ) l=n+1 ; else l=n; for(i=0 ;i1 ) l=n+1 ; else l=n; for(i=0 ;i2 ) l=n+1 ; else l=n; for(i=0 ;i
PAGE 113

! ""#! //left if(m>3 ) l=n+1 ; else l=n; for(i=0 ;icurrent_point;i--) { points[(i+numofnewpoints numofpoints)*2 + 1 ]=points[i*2 + 1 ]; points[(i+numofnewpoints numofpoints)*2 + 0 ]=points[i*2 + 0 ]; } //new points for(i=0 ;i<(numofnewpointsnumofpoints);i++) { points[(current_point+ 1 +i)*2 + 0 ]=points[current_point* 2 + 0 ]+dx*(i+1.0)/(numofnewpointsnumofpoints+1.0); points[(current_point+ 1 +i)*2 + 1 ]=points[current_point* 2 + 1 ]+dy*(i+1.0)/(numofnewpointsnumofpoints+1.0); } } // --------------------------------------------------------------------------------// HandlePropertyChangeValue [INTERRUPT SAFE] // -------------------------------------------------------------------------------// ### This function is called whenever one of the input values of an actor changes. // The onebased property index of the input is given by inPropertyIndex1. // The new value is given by inNewValue, the pre vious value by inOldValue. // static void HandlePropertyChangeValue( IsadoraParameters* ip, ActorInfo* inActorInfo, PropertyIndex inPropertyIndex1, // the onebased index of the property than //changed values ValuePtr /* inOldValue */, // the property's old value ValuePtr inNewValue, // the property's new value Boolean inInitializing) // true if the value is being set when an //actor is first initalized { PluginInfo* info = GetPluginInfo_(inActorInfo); // ### When you add/change/remove properties, you will need to add cases // to this switch statement, to process the messages for your // input properties // The value comes to you encapsulated in a Value structure. See // ValueCommon.h for details about the contents of this structure.

PAGE 114

! ""#! switch (inPropertyIndex1) { case kInputVideoIn: { // if bypass is off, then we store the incoming video frame reference // into our image buffer -it will be processed when our ReceiveMessage // function next receives a kWantVideoFrameTick message //if (info>mBypass == false) { SetImageBufferValue(ip, &info> mImageBufferMap, 0 GetDataValueOfType(inNewValue, k ImageBufferDataType, ImageBufferPtr)); // set mNeedsDraw flag to ensure that new video image is drawn info> mNeedsDraw |= true; // if bypass is on, we simply send the incoming video frame reference // on to the output -bypassing all processing entirely. } break; case kInputEditMode: { info> justStartedEditingX= false; info> justStartedEditingY= false; info> mEditMode = inNewValue> u ivalue; info> mNeedsDraw = true;} break; case kInputEditSize: {info> mEditSize = inNewValue> u ivalue; info> mNeedsDraw = true;} break; case kInputKeyCode: if (inNewValue> u ivalue == 0 ) break; if (inNewValue> u ivalue == 28) { info> justStartedEditingX= false; info> justStartedEditingY= false; info> mCurrentVertexIndex--; if (info> mCurrentVertexIndex < 0 ) info> mCurrentVertexIndex = (info> mVerticesCount / 2 ) 1 ; } else if (inNewValue> u ivalue == 29) { info> justStartedEditingX= false; info> justStartedEditingY= false; info> mCurrentVertexIndex++; if (info> mCurrentVertexIndex >= (info> mVerticesCount / 2 )) info> mCurrentVertexIndex = 0 ; } if (info> mCurrentVertexIndex < 0 ) info> mCurrentVertexIndex = 0 ; Value v; v.type = kInteger; v.u ivalue = 0 ; SetInputPropertyValue_ (ip, inActorInfo, inPropertyIndex1, &v); info> mNeedsDraw |= true; break; case kInputMouseX: if (info> mEditMode) { float value = inNewValue> u fvalue

PAGE 115

! ""#! Value v; v.type = kFloat; v.u fvalue = value; if((info> justStartedEditingX ==true)&&(info> justStartedEditingY==true)){ info> mCoordinates[(info> mCurrentVertexIndex 2 ) + 0 ] = value; int index = (kInputVerticesCount + (info> mCurrentVertexIndex 2 ) + 0 + 1 ); SetInputPropertyValue_ (ip, inActorInfo, index, &v); } else if( abs(valueinfo> mCoordinates[(info> mCurrentVertexIndex 2 ) + 0 ])<5 ) info> justStartedEditingX= true; info> mNeedsDraw |= true; info> RecalulateMask= true; } break; case kInputMouseY: if (info> mEditMode) { float value = inNewValue> u fvalue Value v; v.type = kFloat; v.u fvalue = value; if((info> justStartedEditingX ==true)&&(info> justStartedEditingY==true)){ info> mCoordinates[(info> mCurrentVertexIndex 2 ) + 1 ] = value; int index = (kInputVerticesCount + (info> mCurrentVertexIndex 2 ) + 1 + 1 ); SetInputPropertyValue_ (ip, inActorInfo, index, &v); } else if( abs(valueinfo> mCoordinates[(info> mCurrentVertexIndex 2 ) + 1 ])<5 ) info> justStartedEditingY= true; info> mNeedsDraw |= true; info> RecalulateMask= true; } break; case kAntiAliasing: {info> anti_aliasing= inNewValue> u ivalue; info> mNeedsDraw = true;info> RecalulateMask= true;} break; case kInputBypass: // store member variable for on/off info> mBypass = (inNewValue> u ivalue != 0 ); // if "bypass" is going from on to off, we need to // reallocate our message receiver so that we will // start receiving video frame tick messages again if (info> mBypass == false) { if (info> mMessageReceiver == nil) { info> mMessageReceiver = CreateActorMessageReceiver_ ( ip, inActorInfo, ReceiveMessage, 0 kWantVideoFrameTick,

PAGE 116

! ""#! ( long) inActorInfo); } // if "bypass" is going from off to on, then we want to // stop processing video. We dispose our message receiver // here to save processing power -when bypass is "on" the // incoming video is sent directly to the output -see the // case kInputVideoIn abo ve. } else { if (info> mMessageReceiver != nil) { DisposeMessageReceiver_ (ip, info> mMessageReceiver); info> mMessageReceiver = nil; info> mNeedsDraw |= true; } } break; case kInputVerticesCount: int newCount = inNewValue > u ivalue; newCount *= 2 ; if (newCount == info> mVerticesCount) break; if(info> mVerticesCount>=3 && newCount>info>mVericesCount) { setNewCoordinates(info > mCoordinates,info> mCurrentVertexIndex,info>mVericesCount/2 ,newCount/2 ); } else if(info> mVerticesCount< 3 && newCount>0 ) { setInitialCoordinates (info> mCoordinates,newCount/2 ); } if (inInitializing) { info> mVerticesCount = newCount; break; } if (newCount > info> mVerticesCount) { while (info> mVerticesCount < newCount) { Value v; v.type = kFloat; addInputProperty( ip, inActorInfo, info, "vertex %d X", info> mVerticesCount ); v.u fvalue = info> mCoordinates[info> mVerticesCount]; SetInputPropertyValue_ (ip, inActorInfo, (kInputVerticesCount + (info> mVerticesCount ) + 0 + 1 ), &v); info> mVerticesCount++; addInputProperty( ip, i nActorInfo, info, "vertex %d Y", info> mVerticesCount ); v.u fvalue = info> mCoordinates[info> mVerticesCount]; SetInputPropertyValue_ (ip, inActorInfo, (kInputVerticesCount + (info> mVerticesCount ) + 0 + 1 ), &v);

PAGE 117

! ""#! info> mVerticesCount++; } } else { while (info> mVerticesCount > newCount) { removeInputProperty( ip, inActorInfo, info ); info> mVerticesCount--; } } break; } if (inPropertyIndex1 > kInputVerticesCount) { int index = inPropertyIndex1 ( kInputVerticesCount + 1 ); info> mCoordinates[index] = (float) (inNewValue> u fvalue); info> mNeedsDraw = true;info> RecalulateMask= true; } } static void removeInputProperty( IsadoraParameters* ip, ActorInfo* inActorInfo, const PluginInfo* info ) { IzzyError err = RemovePropertyProc_( ip, inActorInfo, kInputProperty, kInputVerticesCount + 0 + info> mVerticesCount ); } static void addInputProperty( IsadoraParameters* ip, A c torInfo* inActorInfo, PluginInfo* info, const char* nameTemplate, const int index1 ) { //Value valueInit; info> valuesInit[info> mVerticesCount].type = kFloat; info> valuesInit[info> mVerticesCount].u fvalue = 0 ; PropertyDispFormat availFmts; PropertyDispFormat curFmt; if ( GetPropertyDisplayFormats_ (ip, inActorInfo, kInputProperty, kInputFValue, &availFmts, &curFmt) != noErr) { } int index = kInputVerticesCount + 1 + info> mVerticesCount; OSType rateType = CreatePropertyID(ip, "it", index); char propertyName[128]; sprintf(propertyName, nameTemplate, info > mVerticesCount / 2 );

PAGE 118

! ""#! //Value valueMin; //Value valueMax; info> valuesMin[info> mVerticesCount].type = kFloat; info> valuesMax[info> mVerticesCount].type = kFloat; info> valuesMin[info> mVerticesCount].u fvalue = info> mVertexCoordinateMin ; info> valuesMax[info> mVerticesCount].u fvalue = info> mVertexCoordinateMax ; IzzyError err = AddProperty_( ip, inActorInfo, kInputProperty, rateType, // the input type FOUR_CHAR_CODE( 'fVal'), // the input to which we will conform propertyName, availFmts, curFmt, 1 &(info> valuesMin[info> mVerticesCount]), &(info> valuesMax[info> mVerticesCount]), &(info> valuesInit[info> mVerticesCount]) ); if (err != noErr) { } PluginAssert_(ip, err == noErr); } // --------------------------------------------------------------------------------// GetActorDefinedArea // --------------------------------------------------------------------------------// If the mGetActorDefinedAreaProc in the ActorInfo struct points to this function, // it indicates to Isadora that the object would like to draw either an icon or else // an graphic representation of its function. // // ### This function uses the 'PICT' 0 resource stored with the plugin to draw a n icon. // You should replace this picture (located in the Plugin Resources.rsrc file) with // the icon for your actor. // static ActorPictInfo gPictInfo = { false, nil, nil, 0 0 }; static Boolean GetActorDefinedArea( IsadoraParameters* ip, ActorInfo* inActorInfo, SInt16* outTopAreaWidth,// returns the width to reserve for the top Actor Defined Area SInt16* outTopAreaMinHeight, // returns the minimum height of the top area SInt16* outBotAreaHeight, // returns the wd to reserve for bottom Actor Defined Area SInt16* outBotAreaMinWidth) // returns the minimum width of the bottom area { if (!gPictInfo. mInitialized) { PrepareActorDefinedAreaPict_ (ip, inActorInfo, 0 &gPictInfo); } // place picture in top area *outTopAreaWidth = gPictInfo. mWidth; *outTopAreaMinHeight = gPictInfo. mHeight; // don't draw anything in bottom area *outBotAreaHeight = 0 ; *outBotAreaMinWidth = 0 ; return true; }

PAGE 119

! ""#! // --------------------------------------------------------------------------------// DrawActorDefinedArea // --------------------------------------------------------------------------------// If GetActorDefinedArea is defined, then this function will be called whenever // your ActorDefinedArea needs to be drawn. // // Beacuse we are using the PICT 0 resource stored with this plugin, we can use // the DrawActorDefinedAreaPict_ supplied by the Isadora callbacks. // // DrawActorDefinedAreaPict_ is Alpha Channel aware, so you can have nice // shading if you like. static void DrawActorDefinedArea( IsadoraParameters* ip, ActorInfo* inActorInfo, void* /* inDrawingContext */ // unused at present ActorDefinedAreaPart inActorDefinedAreaPart, // the part of the actor that needs to be drawn ActorAreaDrawFlagsT /* inAreaDrawFlags */ // actor draw flags Rect* inADAArea, // rect enclosing the entire Actor Defined Area Rect* /* inUpdateArea */, / / subset of inADAArea that needs updat ing Boolean inSelected) // TRUE if actor is currently selected { if (inActorDefinedAreaPart == kActorDefinedAreaTop && gPictInfo. mInitialized) { DrawActorDefinedAreaPict_ (ip, inActorInfo, inSelected, inADAArea, & gPictInfo); } } // --------------------------------------------------------------------------------// GetActorInfo // --------------------------------------------------------------------------------// This is function is called by to get the actor's class and ID, and to get // pointers to the all of the plugin functions declared locally. // // All members of the ActorInfo struct pointed to by outActorParams have been // set to 0 on entry. You only need set functions defined by your plugin // EXPORT_ void GetActorInfo( void* /* inParam */, ActorInfo* outActorParams) { // REQUIRED information outActorParams> mActorName = kActorName; outActorParams> mClass = kActorClass; outActorParams> mID = kActorID; outActorParams> mCompatibleWithVersion = kCurrentIsadoraCallbackVersion ; // REQUIRED functions outActorParams> mGetActorParameterStringProc = GetParameterString; outActorParams> mGetActorHelpStringProc = GetHelpString; outActorParams> mCreateActorProc = CreateActor; outActorParams> mDisposeActorProc = DisposeActor; outActorParams> mActivateActorProc = ActivateActor; outActorParams> mHandlePropertyChangeValueProc = HandlePropertyChangeValue; // OPTIONAL FUNCTIONS outActorParams> mHandlePropertyChangeTypeProc = NULL; outActorParams> mHandlePropertyConnectProc = NULL; outActorParams> mPropertyValueToStringProc = NULL; outActorParams> mPropertyStringToValueProc = NULL; outActorParams> mGetActorDefinedAreaProc = GetActorDefinedArea; outActorParams> mDrawActorDefinedAre aProc = DrawActorDefinedArea; outActorParams> mMouseTrackInActorDefinedAreaProc = NULL; } // --------------------------------------------------------------------------------// ProcessVideoFrame // --------------------------------------------------------------------------------

PAGE 120

! "#$! // ### This is the code that does the actual processing of a video frame. Modify // this code to create your own filter. // int pointInPolygon(float* coordinates, int polySides, float xx, float yy,int width,int height) { int i, j = polySides 1 ; //delete to occlude polygon// bool oddNodes = false; float x=xx*100.0/width; float y=yy*100.0/height; for (i=0 ; i < polySides; i++) { if ( (coordinates[(i 2 ) + 1 ] < y && coordinates[(j 2 ) + 1 ] >= y) || (coordinates[(j 2 ) + 1 ] < y && coordinates[(i 2 ) + 1 ] >= y) ) { if ( ( coordinates[(i 2 ) + 0 ] + (y coordinates[(i 2 ) + 1 ]) / (coordinates[(j 2 ) + 1 ] coordinates[(i 2 ) + 1 ]) (coordinates[(j 2 ) + 0 ] coordinates[(i 2 ) + 0 ]) ) < x ) { oddNodes=!oddNodes; } } j=i; } if(oddNodes==false) return 0 ; else return 1 ; } void MaskRecalculation(IsadoraParameters* ip, PluginInfo* info, int width, int height) { if(info> AlphaMask!=NULL) IzzyFree_(ip, info> AlphaMask); info> AlphaMask = (int*) IzzyMallocClear_(ip, sizeof( int) width height); int polySize = info> mVerticesCount / 2 ; // for each row SInt16 row = 0 ; while (row < height) { // and for each column in that row SInt16 col = 0 ; while (col < width) { int sum=0 ; int num_of_tests=1 ; if(info> anti_aliasing==0 ) { sum=pointInPolygon(info> mCoordinates polySize, col, row,width,height); num_of_tests=1 ; } else if(info> anti_aliasing==1 ) { sum= pointInPolygon(info> mCoordinates, polySize, col+0.246490, row+0.249999,width,height) + pointInPolygon(info> mCoordinates, polySize, col0.246490, row0.249999,width,height); num_of_tests=2 ; }

PAGE 121

! "#"! else if(info> anti_aliasing==2 ) { sum= pointInPolygon(info> mCoordinates, polySize, col0.373411, row0.250550,width,height) + pointInPolygon(info> mCoordinates, polySize, col+0.256263, row+0.368119,width,height) + pointInPolygon(info> mCoordinates, polySize, col+0.117148, row0.117570,width,height); num_of_tests=3 ; } else if(info> anti_aliasing==3 ) { sum= pointInPolygon(info> mCoordinates, polySize, col0.208147, row+0.353730,width,height) + pointInPolygon(info> mCoordinates, polySize, col+0.203849, row0.353780,width,height) + pointInPolygon(info> mCoordinates, polySize, col0.292626, row0.149945,width,height) + pointInPolygon(info> mCoordinates, polySize, col+0.296924, row+0.149994,width,height); num_of_tests=4 ; } else if(info> anti_aliasing==4 ) { sum= pointInPolygon(info> mCoordinates, polySize, col0.334818, row+0.435331,width,height) + pointInPolygon(info> mCoordinates, polySize, col+0.286438, row0.393495,width,height) + pointInPolygon(info> mCoordinates, polySize, col+0.459462, row+0.141540,width,height) + pointInPolygon(info> mCoordinates, polySize, col0.414498, row0.192829,width,height) + pointInPolygon(info> mCoordinates, polySize, col0.183790, row+0.299133,width,height) + pointInPolygon(info> mCoordinates, polySize, col0.079263, row0.317383,width,height) + pointInPolygon(info> mCoordinates, polySize, col+0.102254, row+0.353730,width,height) + pointInPolygon(info> mCoordinates, polySize, col+0.164216, row0.054399,width,height); num_of_tests=8 ; } else if(info> anti_aliasing==5 ) { sum= pointInPolygon(info> mCoordinates, polySize, col+0.285561, row+0.188437,width,height) + pointInPolygon(info> mCoordinates, polySize, col+0.360176, row0.065688,width,height) + pointInPolygon(info> mCoordinates, polySize, col0.111751, row+0.275019,width,height) + pointInPolygon(info> mCoordinates, polySize, col0.055918, row0.215197,width,height) + pointInPolygon(info> mCoordinates, polySize, col0.080231, row0.470965,width,height) + pointInPolygon(info> mCoordinates, polySize, col+0.138721, row+0.409168,width,height) + pointInPolygon(info> mCoordinates, polySize, col+0.384120, row+0.458500,width,height) + pointInPolygon(info> mCoordinates, polySize, col0.454968, row+0.134088,width,height) + pointInPolygon(info> mCoordinates, polySize, col+0.179271, row0.331196,width,height) + pointInPolygon(info> mCoordinates, polySize, col0.307049, row0.364927,width,height) + pointInPolygon(info> mCoordinates, polySize, col+0.105354, row0.010099,width,height) + pointInPolygon(info> mCoordinates, polySize, col0.154180, row+0.021794,width,height) + pointInPolygon(info> mCoordinates, polySize, col0.370135, row0.116425,width,height) + pointInPolygon(info> mCoordinates, polySize, col+0.451636, row0.300013,width,height) + pointInPolygon(info> mCoordinates, polySize, col0.370610, row+0.387504,width,height); num_of_tests=15; } else if(info> anti_aliasing==6 ) { sum=

PAGE 122

! "##! pointInPolygon(info> mCoordinates, polySize, col+0.030245, row+0.136384,width,height) + pointInPolygon(info> mCoordinates, polySize, col+0.018865, row0.348867,width,height) + pointInPolygon(info> mCoordinates, polySize, col0.350114, row0.472309,width,height) + pointInPolygon(info> mCoordinates, polySize, col+0.222181, row+0.149524,width,height) + pointInPolygon(info> mCoordinates, polySize, col0.393670, row0.266873,width,height) + pointInPolygon(info> mCoordinates, polySize, col+0.404568, row+0.230436,width,height) + pointInPolygon(info> mCoordinates, polySize, col+0.098381, row+0.465337,width,height) + pointInPolygon(info> mCoordinates, polySize, col+0.462671, row+0.442116,width,height) + pointInPolygon(info> mCoordinates, polySize, col+0.400373, row0.212720,width,height) + pointInPolygon(info> mCoordinates, polySize, col0.409988, row+0.263345,width,height) + pointInPolygon(info> mCoordinates, polySize, col0.115878, row0.001981,width,height) + pointInPolygon(info> mCoordinates, polySize, col+0.348425, row0.009237,width,height) + pointInPolygon(info> mCoordinates, polySize, col0.464016, row+0.066467,width,height) + pointInPolygon(info> mCoordinates, polySize, col0.138674, row0.468006,width,height) + pointInPolygon( info> mCoordinates, polySize, col+0.144932, row0.022780,width,height) + pointInPolygon(info> mCoordinates, polySize, col0.250195, row+0.150161,width,height) + pointInPolygon(info> mCoordinates, polySize, col0.181400, row0.264219,width,height) + pointInPolygon(info> mCoordinates, polySize, col+0.196097, row0.234139,width,height) + pointInPolygon(info> mCoordinates, polySize, col0.311082, row0.078815,width,height) + pointInPolygon(info> mCoordinates, polySize, col+0.268379, row+0.366778,width,height) + pointInPolygon(info> mCoordinates, polySize, col0.040601, row+0.327109,width,height) + pointInPolygon(info> mCoordinates, polySize, col0.234392, row+0.354659,width,height) + pointInPolygon(info> mCoordinates, polySize, col0.003102, row0.154402,width,height) + pointInPolygon(info> mCoordinates, polySize, col+0.297997, row0.417965,width,height); num_of_tests=24; } else if(info> anti_aliasing==7 ) { sum= pointInPolygon(info> mCoordinates, polySize, col+0.266377, row0.218171,width,height) + pointInPolygon(info> mCoordinates, polySize, col0.170919, row0.429368,width,height) + pointInPolygon(info> mCoordinates, polySize, col+0.047356, row0.387135,width,height) + pointInPolygon(info> mCoordinates, polySize, col0.430063, row+0.363413,width,height) + pointInPolygon(info> mCoordinates, polySize, col0.221638, row0.313768,width,height) + pointInPolygon(info> mCoordinates, polySize, col+0.124758, row0.197109,width,height) + pointInPolygon(info> mCoordinates, polySize, col0.400021, row+0.482195,width,height) + pointInPolygon(info> mCoordinates, polySize, col+0.247882, row+0.152010,width,height) + pointInPolygon(info> mCoordinates, polySize, col0.286709, row0.470214,width,height) + pointInPolygon(info> mCoordinates, polySize, col0.426790, row+0.004977,width,height) + pointInPolygon(info> mCoordinates, polySize, col0.361249, row0.104549,width,height) + pointInPolygon (info> mCoordinates, polySize, col0.040643, row+0.123453,width,height) + pointInPolygon(info> mCoordinates, polySize, col0.189296, row+0.438963,width,height) + pointInPolygon(info> mCoordinates, polySize, col0.453521, row0.299889,width,height) + pointInPolygon(info> mCoordinates, polySize, col+0.408216, row0.457699,width,height) + pointInPolygon(info> mCoordinates, polySize, col+0.328973, row0.101914,width,height) + 
pointInPolygon(info> mCoordinates, polySize, col0.055540, row0.477952,width,height) + pointInPolygon(info> mCoordinates, polySize, col+0.194421, row+0.453510,width,height) + pointInPolygon(info> mCoordinates, polySize, col+0.404051, row+0.224974,width,height) + pointInPolygon(info> mCoordinates, polySize, col+0.310136, row+0.419700,width,height) + pointInPolygon(info> mCoordinates, polySize, col0.021743, row+0.403898,width,height) + pointInPolygon(info> mCoordinates, polySize, col0.466210, row+0.248839,width,height) + pointInPolygon(info> mCoordinates, polySize, col+0.341369, row+0.081490,width,height) + pointInPolygon(info> mCoordinates, polySize, col+0.124156, row0.016859,width,height) + pointInPolygon(info> mCoordinates, polySize, col0.461321, row0.176661,width,height) + pointInPolygon(info> mCoordinates, polySize, col+0.013210, row+0.234401,width,height) + pointInPolygon(info> mCoordinates, polySize, col+0.174258, row0.311854,width,height) + pointInPolygon(info> mCoordinates, polySize, col+0.294061, row+0.263364,width,height) + pointInPolygon(info> mCoordinates, polySize, col0.114836, row+0.328189,width,height) + pointInPolygon(info> mCoordinates, polySize, col+0.041206, row0.106205,width,height) + pointInPolygon(info> mCoordinates, polySize, col+0.079227, row+0.345021,width,height) + pointInPolygon(info> mCoordinates, polySize, col0.109319, row0.242380,width,height) + pointInPolygon(info> mCoordinates, polySize, col+0.425005, row0.332397,width,height) + pointInPolygon(info> mCoordinates, polySize, col+0.009146, row+0.015098,width,height) + pointInPolygon(info> mCoordinates, polySize, col0.339084, row0.355707,width,height) + pointInPolygon(info> mCoordinates, polySize, col0.224596, row0.189548,width,height) + pointInPolygon(info> mCoordinates, polySize, col+0.083475, row+0.117028,width,height) +

PAGE 123

! "#$! pointInPolygon(info> mCoordinates, polySize, col+0.295962, row0.334699,width,height) + pointInPolygon(info> mCoordinates, polySize, col+0.452998, row+0.025397,width,height) + pointInPolygon(info> mCoordinates, polySize, col+0.206511, row0.104668,width,height) + pointInPolygon(info> mCoordinates, polySize, col+0.447544, row0.096004,width,height) + pointInPolygon(info> mCoordinates, polySize, col0.108006, row0.002471,width,height) + pointInPolygon(info> mCoordinates, polySize, col0.380810, row+0.130036,width,height) + pointInPolygon(info> mCoordinates, polySize, col0.242440, row+0.186934,width,height) + pointInPolygon(info> mCoordinates, polySize, col0.200363, row+0.070863,width,height) + pointInPolygon(info> mCoordinates, polySize, col0.344844, row0.230814,width,height) + pointInPolygon(info> mCoordinates, polySize, col+0.408660, row+0.345826,width,height) + pointInPolygon(info> mCoordinates, polySize, col0.233016, row+0.305203,width,height) + pointInPolygon(info> mCoordinates, polySize, col+0.158475, row0.430762,width,height) + pointInPolygon(info> mCoordinates, polySize, col+0.486972, row+0.139163,width,height) + pointInPolygon(info> mCoordinates, polySize, col0.301610, row+0.009319,width,height) + pointInPolygon(info> mCoordinates, polySize, col+0.282245, row0.458671,width,height) + pointInPolygon(info> mCoordinates, polySize, col+0.482046, row+0.443890,width,height) + pointInPolygon(info> mCoordinates, polySize, col0.121527, row+0.210223,width,height) + pointInPolygon(info> mCoordinates, polySize, col0.477606, row0.424878 ,width,height) + pointInPolygon(info> mCoordinates, polySize, col0.083941, row0.121440,width,height) + pointInPolygon(info> mCoordinates, polySize, col0.345773, row+0.253779,width,height) + pointInPolygon(info> mCoordinates, polySize, col+0.234646, row+0.034549,width,height) + pointInPolygon(info> mCoordinates, polySize, col+0.394102, row0.210901,width,height) + pointInPolygon(info> mCoordinates, polySize, col0.312571, row+0.397656,width,height) + pointInPolygon(info> mCoordinates, polySize, col+0.200906, row+0.333293,width,height) + pointInPolygon(info> mCoordinates, polySize, col+0.018703, row0.261792,width,height) + pointInPolygon(info> mCoordinates, polySize, col0.209349, row0.065383,width,height) + pointInPolygon(info> mCoordinates, polySize, col+0.076248, row+0.478538,width,height) + pointInPolygon(info> mCoordinates, polySize, col0.073036, row0.355064,width,height) + pointInPolygon(info> mCoordinates, polySize, col+0.145087, row+0.221726, width,height); num_of_tests=66; } info> AlphaMask[row*width+col]=(255*sum)/num_of_tests; col++; } row++; } info> RecalulateMask= false; } static void ProcessVideoFrame( IsadoraParameters* ip // not used in this function, but needed to call PluginAssert_ PluginInfo* info, ImageBufferPtr srcBuf, ImageBufferPtr outBuf) { UInt32* srcData = static_cast< UInt32*>(srcBuf> mBaseAddress); UInt32 srcStride = srcBuf> mRowBytes srcBuf> m Width sizeof( UInt32); UInt32* outData = static_cast< UInt32*>(outBuf> mBaseAddress); UInt32 outStride = outBuf> mRowBytes outBuf> mWidth sizeof( UInt32); info> mCurrentWidth = outBuf> mWidth; info> mCurrentHeight = outBuf> mHeight; // ### this is where your video processing code would go // here, we are increasing or decreasing the red, green, // and blue components of the video image.

PAGE 124

! "#$! /* SInt16 redFactor = static_cast(256.0 info >mRedAmount); SInt16 greenFactor = static_ca st(256.0 info >mGreenAmount); SInt16 blueFactor = static_cast(256.0 info >mBlueAmount); */ // int polySize = 6; //this allows for all 6 verticies// int polySize = info> mVerticesCount / 2 ; if ((polySize <= 0 ) || (polySize < 3 )) return; if(info> RecalulateMask==true || outBuf> mWidth!=info> AlphaMaskWidth || outBuf> mHeight!=info> AlphaMaskHeight) MaskRecalculation(ip,info,outBuf> mWidth,outBuf> mHeight); // for each row SInt16 row = 0 ; while (row < outBuf> mHeight) { // and for each column in that row SInt16 col = 0 ; while (col < outBuf> mWidth) { // IMRORTANT: For Mac/Windows cross platform compatbility, // make sure to use the RED_, GREEN_ and BLUE_ macros to // extract pixels from a the raw data, and use RGB_ to // combine them back. // // On the Mac, the pixels are arranged 00RRGGBB with // the blue in the low 8 bits. // Under Quicktime for Windows, the data is arranged // BBGGRR00 in reverse order. Using the mac ros is // a easy way to ensure your code will operate on // both platforms. // SInt16 red = RED_(*srcData); SInt16 green = GREEN_(*srcData); SInt16 blue = BLUE_(*srcData); if (info> AlphaMask!=NULL) *(outData) = ARGB_(info> AlphaMask[row*outBuf> mWidth+col], red, green, blue); else *(outData) = ARGB_( 0 red, green, blue); // increment src and out pixel srcData++; outData++; // increment column count col++; } // skip to the next line of video, using the // stride values computed above srcData = (UInt32*)((char*) srcData + srcStride); outData = (UInt32*)((char*) outData + outStride); // increment row count row++; } if(info> mEditMode==1 ) { for( int movy=info> mEditSize + 1 ; movy mEditSize; movy++)

PAGE 125

! "#$! for( int movx=info> mEditSize + 1 ;movx mEditSize;movx++) for ( int i = 0 ; i < (info> mVerticesCount / 2 ); i++) { // to add edit dots to vertices// int x = info> mCoordinates[(i 2 )]*outBuf> mWidth/ 100.0+movx; if (x > outBuf> mWidth) continue; int y = info> mCoordinates[(i 2 ) + 1 ]*outBuf> mHeight/ 100.0+movy; if (y > outBuf> mHeight) continue; if ((x < 0 ) || (y < 0 )) continue; outData = static_cast< UInt32*>(outBuf> mBaseAddress); outData = (UInt32*) ( ( char*) outData + (( (y outBuf> mWidth sizeof( UInt32)) + (outStride y)) ) ); outData += x; if (info> mCurrentVertexIndex == i) { *(outData) = ARGB_( 255, 0 255, 0 ); } else { *(outData) = ARGB_( 255, 255, 0 0 ); } } } } // --------------------------------------------------------------------------------// ReceiveMessage // --------------------------------------------------------------------------------// Isadora broadcasts messages to its Message Receives depending on what message // they are listening to. In this case, we are listening for kWantVid eoFrameTick, // which is broadcast periodically (30 times per second.) When we receive the // message, we check to see if our video frame needs to be updated. If so, we // process the incoming video and pass the newly generated frame to the output. static void ReceiveMessage( IsadoraParameters* ip, MessageMask /* inMessageMask */, // the message that caused this ReceiveMessage PortIndex /* inIndex1 */, // for MIDI messages, the port where msg arrived. const MsgData* /* inData */, // the data associated with this message UInt32 /* inLen */, // the length of the data associated w/ message long inRefCon) // in our use, actually the pointer to ActorInfo { // Convert the refCon into the ActorInfo* that it // really is, so that we can get at our data ActorInfo* actorInfo = reinterpret_cast< ActorInfo*>(inRefCon); // get pointer to plugin info PluginInfo* info = GetPluginInfo_(actorInfo); // We use this Value struct in a few places below... Value v = { kData, nil }; // set a flag to remember if we had an output buffer before we // called UpdateImageBufferMap Boolean wasOutputBuffer = info > mImageBufferMap. mOutputBuffersValid; // ensure that the ImageBufferMap is up valid for the // current input Image Buffer UpdateImageBufferMap (ip, &info> mImageBufferMap); // use GetImageBufferPtr to get the input and output buffers

PAGE 126

! "#$! ImageBufferPtr img1 = GetImageBufferPtr(ip, &info> mImageBufferMap, 0 ); ImageBufferPtr out = GetOutputImageBufferPtr (&info> mImageBufferMap, 0 ); // if we don't have a valid output buffer if (info> mImageBufferMap. mOutputBuffersValid == false) { // if there was an output buffer preivously, we need to // send a 'nil' buffer to let other modules know that // our ouptut is now invalid if (wasOutputBuffer) { v.u data = nil; SetOutputPropertyValue_ (ip, actorInfo, kOutputVideo, &v); } // otherwise, if our mNeedsDraw flag is true, and if we have both // and input buffer and an output buffer, then we can proceed to // process the video image // // we only draw the image if the following are true: // 1) the mNeedsDraw variable is set to true (this is set in the // InputPropertyChangeValue callback above.) // 2) the input image buffer (img1) is not nil // 3) the output image buffer (out) is not nil } else if (info> mNeedsDraw && img1 != nil && out != nil) { // call EnterVideoProcessing_ so that Isadora will accumulate the // amount of time spent processing the video data this is not // requried by highly recommended so that the VPO value in the // Status Window stays accurate. UInt64 vpStart = EnterVideoProcessing_ (ip); // clear the mNeedsDraw flag info> mNeedsDraw = false; // assume for the moment that we won't draw the frame // set this value to true if we change the output Boolean drawFrame = false; // we only process 32 bit data in this plugin if (out> mBitDepth == 32) { ProcessVideoFrame(ip, info, img1, out); drawFrame = true; } // if the drawFrame flag got set, then we need to output the // new video data to our output port here. if (drawFrame) { // IMPORTANT: We have changed the data in the output buffer // so we need to increment the data change count so that // those looking at our data will know that there is new // data in the buffer out> mInfo. mDataChangeCount++; v.u data = out; // send the new video frame to the video output property SetOutputPropertyValue_ (ip, info> mActorInfoPtr, kOutputVideo, &v); } // make sure to compliment EnterVideoProcessing_ with // and ExitVideoProcessing_ call ExitVideoProcessing_ (ip, vpStart); } }

PAGE 127

APPENDIX C: Installation Photo Gallery

3D model in Maya

PAGE 128

! "#$! View of projector image directions onto installations View of Christie projector placement in REVE: Digital Worlds Institute.

PAGE 129

! "#$! Stage Left Center Installation The Tall One Far Stage Left Installation The Arch

PAGE 130

! "#$! Stage Right Installation The Mound Configuration: Closed Stage Right Installation The Mound Configuration: Open Stage R ight Installation The Mound Configuration: Throne Stage Right Installation The Mound Configuration: Split

PAGE 131

! "#"! Construction Stage Right Installation The Mound Configuration: Closed Stage Left Center Installation Stage Left Installation: The Tall One The Arch

PAGE 132

! "#$! Stage Right Installation The Mound Configuration: Open Stage Left Installations The Tall One and The Arch

PAGE 133

! "##! Mapping

PAGE 135

! "#$! Initial mapping session (closed configuration) Final screenshot of vertex points (closed configuration)

PAGE 136

! "#$! Precision mapping with value sliders (throne configuration) Final screenshot of vertex points (throne configuration)

PAGE 137

APPENDIX D: The Last Unicorn Production Photo Gallery

Stage Right Installation (open configuration) Mommy Fortuna Magic

Stage Left Installations Magic Heart

PAGE 138

! "#$! Stage Right Installation (closed configuration) Unicorn Forest Stage Left Installations Balcony

PAGE 139

! "#$! Stage Right Installation (throne configuration) Throne Room Full Stage Woods

PAGE 140

! "#$! Full Stage Carnival Stage Left Installations Along the Road

PAGE 141

! "#"! Full Stage Bedchamber Full Stage Bull Fire

PAGE 142

! "#$! Full Stage Unicorn Waves Full Stage Balcony

PAGE 143

! "#$! LIST OF REFERENCES Beagle, Peter S.. The Last Unicorn. London: Penguin Books, 1968, 1991. Print. Beagle, Peter S.. The Last Unicorn. San Diego, CA: IDW Pub., 2011. Graphic Novel. Beck, Julian. "Our Mission | Living Theatre." Home | Living Theatre. Web. 17 Oct. 2011. < http://www.livingtheatre.org/about/ourmission >. Bloom, Harold. Tennessee Williams's The Glass Menagerie New York: Chelsea House, 2007. Print. Borovay, Zachary. "I Am Not a Lighting Designer." Live Design (2006). livedesignonline.com Live Design. Web. 15 June 2010. . Brecht, Bertolt, and John Willett. Brecht on Theatre; the Development of an Aesthetic. New York: Hill and Wang, 1964. Print. Bursill, Henry. "Shadow Puppets: A Pupet Shadow Comparison." Shadow Puppets: A Series of Novel and Amusing Figures Formed by the Hand Shadow-Puppets.com, 2006. Web. 1 Oct. 2011. . "Christie Projectors: Christie DHD670-E DLP Projector." Projectors, Projector Reviews, LCD Projectors, Home Theater DLP Projectors at ProjectorCentral.com Web. 25 July 2011. . Dawson, Gary Fisher. Documentary Theatre in the United States: An Historical Survey and Analysis of Its Content, Form, and Stagecraft Westport, CT: Greenwood, 1999. Print. Dodson, Bryan, and Michelle Dodson. "Projection Mapping Introduction." Web log post. Video Mapping Blog. Integrated Visions Productions, 2010. Web. 12 Sept. 2011. . Esaak, Shelley. "Dada Art History Basics on the Dada Movement 1916-1923." Art History Resources for Student s, Enthusiasts, Artists and Educators Artist Biographies Art Timelines Images and Picture Galleries. Web. 29 Oct. 2011. . Finley, Darrel Rex. "Point -In-Polygon Algorithm Determining Whether A Point Is Inside A Complex Polygon." Alien Ryder Flex: The WWW Homepage of Darrel Rex Finley. 2007. Web. 30 May 2011. .

PAGE 144

! "##! Hopkins, David. Dada and Surrealism a Very Short Introduction. London: Oxford University Press, 2004. Print. Innes, C. D. Erwin Piscator's Political Theatre: the Development of Modern German Drama. Cambridge: Cambridge UP, 1972. Print. "Jitter.h." Opengl.org. Silicon Graphics, Inc. Web. 07 Sept. 2011. . Kramer, Richard E. ""The Sculpt ural Drama": Tennessee Williams's Plastic Theatre." The Tennessee Williams Annual Review. 2001. Web. 10 Sept. 2011. . Leverich, Lyle. Tom: The Unknown Tennessee Williams New York: Norton, 1995. Print. LoBiondo, Maria. "Emily Mann, Artistic Director of McCarter Theatre." Princeton Online Web. 07 Oct. 2011. . Piscator, Erwin. "The Berlin Production of Paquet's Flags." Essays on German Theater Comp. Margaret Herzfeld-Sander. New York: Continuum, 1985. 182-85. Print. Probst, Gerhard F. Erwin Piscator and the American Theatre. New York: P. Lang, 1991. Print. Savran, David. The Wooster Group, 1975-1985: Breaking the Rules Ann Arbor, Michigan: UMI Research Press, 1986. Print. Szanto, George H. "Information, Distortion, Propaganda: Control Factors in Technological Societies." Theater & Propaganda. Austin: University of Texas, 1978. 23+. Print. The Builders Association. Web. 17 Oct. 2011. . The Last Unicorn. Dir. Jules Bass and Arthur Rankin Jr. Perf. Mia Farrow, Jeff Bridges and Angela Lansbury. Jensen Farley Pictures, 1982. Videocassette. Tytell, John. The Living Theatre: Art, Exile, and Outrage. New York: Grove, 1995. Print.

PAGE 145

! "#$! PROJECTION DESIGNERS PERSONAL WEBSITES AND COMPANY WEBSITES 1024 1024 Architecture / Creative Label / Art Installation / Video Mapping / Ex EXYZT. Web. 14 Sept. 2011. . Peter Flaherty. The Four / / The Five. Web. 20 Sept. 2011. . V Squared Labs. Web. 15 Aug. 2011. . Wendall Harrington. Web. 13 July 2011. < http://www.wendallharrington.com>. Zachary Borovay Projection Designer Web. 05 June 2011. . PROJECTION SOFTWARE AND OTHER RELATED WEBSITES Ableton Homepage. Web. 13 Aug. 2011. . Cycling 74. Web. 17 Oct. 2011. . Dataton Watchout. Web. 10 Oct. 2011. . "Figure 53 | QLab | Live Show Control for Mac OS X." Figure 53 | QCart | Audio Cart for Mac OS X. Web. 17 Oct. 2011. . FreeFrame Open Realtime Video Effects Web. 20 Oct. 2011. . Green Hippo The Worlds Best Solution for Realtime Video Playback on Events Worldwide Home. Web. 17 Oct. 2011. . "Iduun Releases the Much Anticipated MapMapMap Module Modul8 Blog." Modul8 Blog. Web. 1 7 Oct. 2011. . MadMapper | The Video Mapping Software Web. 16 Oct. 2011. . Modul8 VJ Software. Web. 19 Oct. 2011. . Quartz Composer. Web. 17 Oct. 2011. . "Resolume VJ Software Resolume Avenue 3 Features." Resolume VJ Software Live Digital Motion Graphics. Web. 17 Oct. 2011. .

PAGE 146

! "#$! Syphon. Web. 15 Oct. 2011. . "TroikaTronix Isadora." TroikaTronix Home. Web. 03 June 2011. . "VPT 6.0 Conversations with Spaces." Conversations with Spaces Web. 04 Sept. 2011. . WIKIPEDIA SOURCES "Incandescent Light Bulb." Wikipedia, the Free Encyclopedia. Web. 20 Oct. 2011. . "Magic Lantern." Wikipedia, the Free Encyclopedia. Web. 7 Oct. 2011. . "MIDI." Wikipedia, the Free Encyclopedia. Web. 16 Aug. 2011. . "Moir Pattern." Wikipedia, the Free Encyclopedia. Web. 20 Oct. 2011. . "Spatial Anti-aliasing." Wikipedia, the Free Encyclopedia. Web. 18 Aug. 2011. . "Supersampling." Wikipedia, the Free Encyclopedia. Web. 20 Sept. 2011. . PHOTOGRAPHY CREDITS "Moir on Parrot Feathers.jpg." Wikipedia, the Free Encyclopedia. Web. 20 Oct.2011. . Photograph licensed under the GNU Free Documentation License at . Projection software graphical users interface images attributed to the respective companies. Further information can be found at the individual companies websites found in the Projection Software Websites section of the List of References. Images of the installations provided by author, Brittany Powell.

PAGE 147

! "#$! %&'()*+,&-*.!/012-,! %3455678!+9:;<937!97!?9@;A>;3!BC!"BDE!47!F6GH=97@4<<;C!I<934J6K!/L;! 4=!5L;!89M7N;3!9O!5:9!J6MNL5;3=!67J!N36JM65;J!O39A!P67J6347!,4NL!/GL9934JN;!V74@;3=458Z=!&75;37654976;3!QRRB!67J!>;N67!N36JM65;!=GL99M5!L;3!O43=5!=;A;=5;3!599H!5L3;;!G9M3=;=!65!5L;![4N456347N47N!5L;=;!5;GL79<9N4;=!59!6<


PAGE 1

NON-RECTILINEAR PROJECTION DESIGN FOR LIVE CUE-ABLE THEATRICAL PERFORMANCE By BRITTANY POWELL SUPERVISORY COMMITTEE: ANGELOS BARMPOUTIS CHAIR TIZA GARLAND MEMBER PATRICK PAGANO MEMBER A PROJECT IN LIEU OF THESIS PRESENTED TO THE COLLEGE OF FINE ARTS OF THE UNIVERSITY OF FLORIDA IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF MASTER OF ARTS UNIVERSITY OF FLORIDA 2011

PAGE 2

! # 2011 Brittany Powell

PAGE 3

To Donna for being selfish, to Lear for bringing it full circle, to Tim for being the balance

PAGE 4

ACKNOWLEDGEMENTS

I thank the chair and members of my supervisory committee for their guidance and passion, the Digital Worlds Institute and the School of Theatre and Dance for their generous support, the design team of The Last Unicorn for their long hours and brilliant ideas, Anton Yudin for his altruism, and my friends and family for their constant encouragement, support, and patience.

PAGE 5

TABLE OF CONTENTS

Dedication Page 3
Acknowledgments Page 4
Abstract 8
Introduction 10
CHAPTER
1. History of Projections in American Theatre 11
1.1 Agit-prop and Documentary Theatre 11
1.2 Documentary Theatre Re-defined 12
1.3 Piscator's Formative Years 13
1.4 The New School 15
1.5 Tennessee Williams 16
1.6 The Artist Collectives 16
1.7 Bertolt Brecht 18
2 Research Process 19
2.1 Current State of the Industry 19
2.2 Software Overview 21
2.2.1 Software comparison chart 23
2.2.2 Isadora 24
2.2.3 MadMapper 25
2.2.4 Modul8 with free MapMapMap software 26
2.2.5 Resolume 26
2.2.6 Video Projection Tool (VPT) 27
2.2.7 Pure Data (PD) 27
2.2.8 MAX/MSP/Jitter 28
2.2.9 Quartz Composer 28
2.2.10 QLab 29
2.2.11 Watchout 29
2.2.12 Green Hippo 30
3 Development Process 30
3.1 Isadora with Proposed Mapping Software 31

PAGE 6

3.2 Characteristics for Comparison 31
3.2.1 Dynamic mapping 31
3.2.2 Map manipulation 34
3.2.3 Media manipulation 35
3.2.4 Concaved maps 35
3.2.5 Anti-aliasing 35
3.2.6 Adding effects 36
3.3 Polygon Mapper Design 36
3.3.1 Video input 37
3.3.2 Mouse X and mouse Y 37
3.3.3 Key code 39
3.3.4 Anti-aliasing application 40
3.3.4.1 Anti-aliasing defined 40
3.3.4.2 Supersampling 42
3.3.4.3 Jitter sampling 44
3.3.4.4 Jitter points source code 45
3.3.4.5 Anti-aliasing within the Polygon Mapper 46
3.3.5 F value 49
3.3.6 Edit mode 49
3.3.7 Dot size 50
3.3.8 Vertex count 51
3.3.8.1 Adding and subtracting vertices 53
3.3.9 Video output 55
3.3.10 Animated manipulation effects 56
3.3.11 Media manipulation 57
3.3.12 Mapping concaved polygons 58
3.3.13 Adding effects 61
3.4 Cueing with Isadora 61
3.4.1 MIDI control 63
3.5 Troubleshooting 63
3.5.1 Tracking edit dots 64

PAGE 7

3.5.2 Resolution issues 64
3.5.3 Polygon Mapper unlinking bug 67
3.5.4 Isadora issues with show control computer 68
3.6 Conclusion of Development Process 69
4 The Last Unicorn: A Performance Within a Project 69
4.1 Installations 70
4.2 Projectors, Sight Line, and Shadows 73
4.3 Projection Artwork 75
4.4 Installation Artwork 76
Conclusion 77
Footnotes 77
APPENDIX
A. Manual 80
B. Source Code 104
C. Installation Photo Gallery 127
F. The Last Unicorn Production Photo Gallery 137
REFERENCE LIST 143
BIOGRAPHICAL SKETCH 147

PAGE 8

! ) A bstract of Project in Lieu of Thesis P resented to the College of Fine Arts of the University of Florida in Partial Fulfillment of the Requirements for the Degree of Master of Arts NON RECTILINEAR PROJ ECTION DESIGN FOR LIVE CUE ABLE THEATRICAL PERFORMANCE By Brittany Powell December 2011 Chair: Angelos Barmpoutis Major: Digital Arts and Sciences The goal of this project in lieu of thesis is the research and development of a digital projection mappin g tool that is specifically designed for live theatrical performances and that is accessible, easily controllable, and cue able. Working inside of the existing projection software, Isadora, the author construct ed the Polygon Mapper which gives the user c omplete control over mapping non rectilinear objects while utilizing Isadora specific functionality to manipulate and cue performances. This is unique to the theatrical field in that other mapping software lacks complete control, cue ability, or accessi bility. All three of these components must be present to achieve optimal functionality that can be exploited in any theatre, for any show, on any budget. This paper will review the history of projections in theatre, outline the research and development pr ocess for creating and manipulating the software, and provide a detailed manual for mapping using the software designed inside of Isadora. This Masters Thesis proposal has been accepted for presentation at the 2012 Southeastern Theatre

PAGE 9

! Conference in March in Chattanooga, Tennessee under the presentation title name: "Projection Design: Video Mapping onto Non Traditional Surfaces."

PAGE 10

Introduction

Projections came to be in theatre for two reasons: didactic awareness and spectacle performance. Brought to the United States by way of Documentary Theatre, Erwin Piscator pioneered new stage techniques with projections in an effort to force social awareness on the audience. Tennessee Williams, working against that philosophy, felt the theatre would be better served to use projections as a visualization apparatus of the human condition. Both viewpoints took hold in the American theatre industry and spread throughout the country and across the generations. Today, projections in theatre serve one main purpose: to comment on the actor and/or the action. Take, for example, The Sound of Music: the Von Trapp family sings and dances in front of a projection of the mountain ranges of Austria. The projection sets a location for the action and draws the audience into the world of the play. In another example, The Laramie Project, the true story of Matthew Shepard, who was brutally beaten to death for being homosexual, anti-gay protest images are displayed on the screen during the trial of the accused murderers to show the harsh reality of hate in America. Each is an example of projections in use based on Piscator's and Williams' ideas. The sentiment is different, but the overarching theme of making a statement about what or who is on stage is analogous. The rapid advance of technology offers more projection possibilities than ever. Projection mapping is a projection technique that transforms almost any 2D or 3D surface into a video display. Projection mapping is most often used in architectural installations and in live music shows, where projections are the focal point. This technique opens up an entirely new realm of theatre, yet it has had limited reach into the purely theatrical industry as of yet.

PAGE 11

The following chapters of this paper discuss the history, development, and practical application of a technology that can combine both theatrical necessities and designer freedom.

Chapter 1. History of Projections in American Theatre

The projection industry has come a long way since the time of the Chinese shadow puppets of the Han Dynasty 1 and the invention of the magic lantern in the 17th century. 2 Its place in American theatre today was carved out by a series of events that shaped the artists and, by extension, the art of the time.

1.1 Agit-prop and Documentary Theatre

The starting point for projections in theatre begins with Documentary Theatre. Documentary Theatre was precipitated by agit-prop (agitation propaganda) theatre, defined as an attempt "to bring citizens to the proper point of radical awareness so that they can afterwards be moved to voluntary action." 3 Agit-prop theatre, contrary to a form of pure entertainment, was a means of educating and communicating news to the illiterate populace, beginning in the U.S.S.R. after the Revolution of 1917. It is analogous to a "town crier" and is characterized by spontaneous, unrehearsed troupes performing in public venues with no stage, costumes, or props, attempting to arouse support for social issues of the time. 4 Agit-prop theatre is sometimes referred to as Political Theatre; however, the former is characterized by impromptu performances, where the latter is a more structured endeavor. Georg Büchner, a young German playwright, wrote Danton's Death in 1835 about the French Revolution. 5 This play is considered the first building block in the foundation of Documentary Theatre, which would not have lasting worldwide impact until the 1920s through Erwin Piscator, who would later stage a production of Danton's Death in 1956 using projections. 6

PAGE 12

Documentary Theatre can be defined simply by Emily Mann, 7 a well-known Documentary Theatre director and playwright, in an interview with author Gary Fisher Dawson. She explains:

I usually ask [people] if they have seen any documentary films. Almost everyone has. I say, well that's what I do. I go out and find the event. I go to the place. I do a lot of work on it. I do a lot of research on it. I interview a whole lot of people. I find documents that have to do with that. Then I construct a play out of that. I'm working from life and it's very personal. 8

1.2 Documentary Theatre Re-defined

Projections were introduced to theatre in the 1920s through a new form of Documentary Theatre referred to as Epic Theatre. Director and teacher Erwin Piscator is considered the foremost practitioner of this genre of theatre. In Dawson's book, Documentary Theatre in the United States: An Historical Survey and Analysis of its Content, Form, and Stagecraft, the author illustrates the characteristics of Epic Theatre.

Epic Theatre 9
Through-line of action: Focus is on the historical, documented background of the event, using montage and juxtaposition in place of exposition.
Character: Two-dimensional. Factually based within a social context.
Societal dynamic: Past comments on the present. Assumes a critical stance towards society by distancing the audience. Desires to transform society.
Mise-en-scène: Total Theatre.
Acting style: Presentational. Objective acting.

PAGE 13

! "$ Piscator explained that Epic Theatre "was about the extension of the action and the clarification of the background to the action, that is to say it involved a continuation of the play beyond the dramatic framework." 10 Epic Theatre was about the disillusionment of the audie nce, which forced them to confront the harsh reality of the social climate of the time. No longer was theatre to be an escape hinging on the audience's willing suspension of disbelief. "A didactic play was developed from a spectacle play." 11 As social iss ues of the time made thei r way onto the stage, theatre techniques had to evolve to keep up. Instead of simply talking about the post Nazi era on stage, photographs and newspaper clippings could be projected onto the stage, sound clips and loud speakers co uld bombard the audience, advanced staging machinery could link ed the inside of the theatre with the booming Machinery Age going on outside. All of this was done to give the audience a more realistic sense of time, place, and call to action. This is the p remise of Epic Theatre and the idea of a Total Theatre, characterized by the use of all technical aspects in the theatre: projections, light, sound, machinery, staging, costumes, props, etc. Piscator brought all of this to the American stage. 1.3 Pisc ator's Formative Years In 1915, Piscator, a German national, was conscripted into the First World War where he served 2 years at the front lines 12 During the war, he was commanded to work with the "army acting groups which entertained the troops with pop ular comedy, crudely done." 13 After the war, which Piscator blame d on capitalism, he joined the C ommunist party. 14 His exposure to the violence of the war and the asinine theatre meant to block out the reality of the soilders' situation shaped his repudia tion of art for the sake of entertainment and he "rejected all art which had no relevance to the real conditions of life." 15 However, after returning to Berlin in early 1919 after the end of the war, he was introduced to the Dada movement by a friend. Da da was essentially an "international artistic phenomenon, which sought to overturn the traditional bourgeois notion of art." 16 Established in Zurich in 1916, Dada

PAGE 14

developed from a combination of factors that altered the world's mindset: the First World War, the French Revolution, technological advances in the Machine Age, Freud. 17 Dada art had no rules or even guidelines. Art became a reflection of one's inner self and was created to provoke and offend traditionalists. 18 One of the most notable pieces created during the Dada movement (or non-movement, as it was sometimes called) was created by Marcel Duchamp painting a mustache and writing obscenities on a copy of the Mona Lisa. 19 The non-movement reached Germany in 1918, and due to the post-war, politically charged social and economic climate, "Berlin Dadaists tended to be highly politicized." 20 The Dada movement ended in the early 1920s and gave rise to Surrealism in 1924, which boasts artists such as Salvador Dalí, Joan Miró, and René Magritte. Well-known Dadaists include Tristan Tzara and Hannah Höch. Piscator's involvement with the Dada movement was brief and ultimately reconfirmed his earlier principles that art must be based on current events and present a call to action to the audience. Dada did not produce the audience response Piscator sought. Despite his abandonment of the overall purpose of the movement, Piscator's works were heavily influenced by the theatrical techniques employed by artists of the movement. Yvan Goll, a foremost Expressionist of the time mostly credited for his poetry, first introduced projections into his productions of The Chaplinade and The Immortal in 1920, 21 four years before Piscator would implement the same effect in his production of Flags. Piscator used "projections of stills and of motion pictures" in his staging of Alfons Paquet's Flags (Fahnen) at the Volksbühne in Berlin in 1924. 22 This is considered by most scholars to be the revolution of Documentary Theatre and technical theatre. This shift earned Piscator the reputation of bringing Epic Theatre and, thus, projections into mainstream theatre as we know it today. While Goll first brought projections to the stage, Piscator believed he gave them true meaning. He claimed "[Flags] was the first time to my knowledge that slide projection had been used in this fashion." 23 While projections are the most notable influence from Goll and the Dada

PAGE 15

! "& m ovement, other techniques were implemented into Piscator's productions. 24 Goll used these techniques to allow the audience to transcend real life into the imagination, while Piscator used the techniques to confront the audience with real life. 25 After the production of Flags, only one of Piscator's productions, What Price Glory?, would be performed without the use of projections. 26 1.4 The New School Leaving Europe in 1938, Piscator established himself in the United States by founding the Dramatic Worksh op at the New School for Social Research in New York. Teaching courses on acting, directing, and playwriting, Piscator influenced an entire generation of artists including Tennessee Williams, Arthur Miller Marlon Brando Bea Arthur Judith Malina (co fou nder of the Living Theatre) 27 During his time in the United States, Piscator further developed his technique of Epic T heatre and t he Dramatic Workshop is largely responsible for the mobilization of the Off Broadway movement, which provided a venue for ex perimental theatre. Piscator fled to former West Germany after being subpoenaed by the House Un American Activities Committee (HUAC) in 1951 He never returned to theater in the United States 28 Yet, he would have a lasting impression on both performance and technical theatre. After Piscator's abrupt departure from the Dramatic Workshop his wife, Maria Ley Piscator became the director of the Dramatic Worksh op and subsequent organizations. 29 Piscator's most notable contribution s to the America theatre we re his unique and experimental staging techniques, which can be seen in the Off and Off Off Broadway theatres. 30 The students Piscator educated while at the New School would go on to be come some of the most influential artist in American theatre taking wit h them a sense of lasting association with Epic T heatre techniques

PAGE 16

! "' 1.5 Tennessee Williams I n a letter to his mother, Williams described Piscator as "a terribly dictatorial German, completely impractical" and "to comply with his demands will destro y the poetic quality of the play." 31 Although Williams loathed Piscator's teaching practices and ideals on the nature of art, he still highly respected his theatrical techniques. 32 Williams coined the term "plastic theatre" in his production notes for The Glass Menagerie referring to the use of projections, music, and lighting as "expressionistic tools, not in attempt to avoid reality, but rather to approach experience more closely." 33 In contrast to Piscator's use of staging techniques, Williams wanted the sentiment of the play to be manifested in a tangible fashion. 34 Specifically, projections were meant to be used to emphasis certain scenes that held significance in the plot. The Glass Menagerie lends itself to this type of plastic thea tre because it is a memory play giving the audience a chance to experience the emotional journey of the characters. The majority of Williams' plays allude to the use of Total Theatre It is through Williams that projections in dramatic plays versus epic plays were introduce d into mainstream theatre. 1.6 The Artist C ollectives The Living Theatre f ounded by Judith Malina and Julian Beck was the first wave of experimental theatre that came out of Piscator's theatrical staging ideas and the Off Broadway movement in the 19 60s. 35 To this day, the Living Theatre is the oldest experimental theatre still in existence. Total T heatre was the instrument for this transformative theatre. Judith Malina comment ed on Piscator's staging techniques remarking that "Piscator's innovative breakthroughs are essential I think that modern theatre couldn't be what it is unless Piscator had done what he did his concept of Total Theatre was the use of everything we have and know now he would be experimenting with laser beams and holograms in the theatre." 36 Concepts from Piscator's definition of Epic Theatre are seen in the Living Theatre's production s and are evident in their mission stat e ment as written by Julian Beck :

PAGE 17

! "( T o call into question who we are to each other in the social environment o f the theater to undo the knots that lead to misery, to spread ourselves across the public's table like platters at a banquet, to set ourselves in motion like a vortex that pulls the spectator into action, to fire the body's secret engines, to pass thro ugh the prism and come out a rainbow, to insist that what happens in the jails matters, to cry Not in my name! at the hour of execution, to move from the theater to the street and from the street to the theater. This is what The Living Theatre does today It is what it has always done. 37 Other groups similar to the Living Theatre were developed in later generations. The anti Vietnam War movement sparked another revitalization of the experimental theatres by way of Off Off Broadway in the early 19 70s. Th e Wooster Group was among the colle ctives that emerged at the time. Distinctive for its "combination of aesthetic and political radicalism with intellectual rigor," 38 nearly all of the Wooster Group's productions have had a video or projection element since it s creation in 1975. Similar to Wooster, t he Builders Association, established in 1994 by Marianne Weems produc ed shows that blend stage performance, text, video, sound, and architecture to tell stories about human experience in the 21st century." 39 S everal other collectives in the United States and abroad utilize the techniques Piscator made relevant with his ideas of Epic and Total Theatre. However, he is often not given the credit for such ideas. Perhaps this has to do with his strict Communist de meanor and lack of respect for actors and playwrights. It is also possible that as a director and producer he did not have a widespread reach beyond his direct audience, as a playwright would.

PAGE 18

! ") 1.7 Bertolt Brecht There is a debate among scholars as to who should be the credited as the inventor of Epic Theatre: Erwin Piscator or his contemporary, Bertolt Brecht. Brecht was a German playwright and director working in Berlin at the same time as Piscator. Brecht is "undeniably the most important playw right to have emerged in Germany since the First World War." 40 As far as the stage techniques generated from Epic Theatre, Brecht's work illustrates that he not only appropriated Piscator's ideas but he "exploited [Piscator's] distinctive techniques." 41 S everal months after Piscator's production of Flags was staged with the use of projections, Brecht wrote Edward II employing the same use of slide projections and title cards. In the years to come there was a multitude of instances where Brecht pilfered P iscator's unique style. 42 Brecht himself asserted that Piscator, who without a doubt is one of the most important theatre men of all times, began to transform [the stage's] scenic potentialities. He introduced a number of far reaching innovations. One of them was his use of the film and of film projections as an integral part of the setting. 43 Brecht's own political theatre style would not be seen until his production of Man is Man in 1931, two years after Piscator articulated his idiosyncratic stage techn iques in his published work, The Political Theatre ( Das Politische Theatre ) in 1929. 44 Whether scholars believe it be Brecht or, indeed, Piscator who brought projections into the theatre, one thing that can be agreed upon is that it is here. Projections a re not solely in theatre either They have or are becoming commonplace in almost all areas of live performance and art

PAGE 19

! "* Chapter 2. Research Process 2.1 Current State of the Industry A general analysis of the type of projections that are in the entertainment industry reveal two major categ ories: background projections and installation projections. These genres are most preva lent in theatre and music or architecture Methods for projection design vary because the focus of the audience changes. Background projections are most commonly seen in theatres because the focus remains on the actors and the action. A physical locale or abstract image is projected onto the screen with an artist edge that helps to further set the atmosphere or tone of th e play. It is a very effective technique and has become increasingly more popular in mainstream theatre. Projections are used as backdrops because in dramatic theatre the most important element is the actor. The actor is responsible for conveying the st ory with the aid of projections, not the other way around. With projections in the background or at the most built around the actor, there is very little to distract or obstruct the view f or the audience. The Broadway world of projection designers is small, at best. There are a handful of designers such as Zachary Borovay 45 ( Lombardi, Xanadu ), Peter Flaherty 46 ( Sondheim on Sondheim ), Wendall Harrington 47 ( Having Our Say, Ragtime, Tommy ) who have set the industry standard for how projections are utilized in theatres. The most common tool for projection design in theatre in Watchout, a video and image playback system that lets the designer place media and add limited effects. This is an ideal program for cueing a show, but with a price tag well over $10,0 00 for multi screen projections Watchout is mostly left on the Broadway stage. For most projection installation performances, there is a look but don't touch policy. These installations can be a large or small scale gallery type event, where the audienc e watches the performan ce unfold onto a unique medium using projection

PAGE 20

mapping tools: "specialized software has been designed to warp and mask the projected image to make it fit perfectly on irregularly shaped screens or objects." 48 Traditionally, rectilinear projections are rectangular images produced by a projector. The screens for these projections are typically mounted to a wall and replicate the shape of the projector's image. In the past, the projector's image dimensions dictated the screen size and shape, limiting the output to a four-sided box. Irregularly shaped screens or objects are considered to be non-rectilinear. The premise behind non-rectilinear projection mapping is that the projector no longer dictates the screen, but rather the other way around. The projected image takes the shape of its surface, regardless of the form. Having a 3-dimensional cube as the surface for projections is considered to be non-rectilinear because the image output is not confined to the traditional four sides. A figure with 30 to 40 sides creates a complex polygonal, non-rectilinear projection map, as seen in the example below from the Polygon Mapper actor created by the author in Isadora as the subject of this project in lieu of thesis. Curved surfaces are also considered non-rectilinear. By using a large number of vertex points, curves can be mapped to outline the form of nearly any object. The projection map pictured to the right was created with the Polygon Mapper actor in Isadora and has 110 individual vertex points that create the shape of the object.
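To make the data involved concrete, the sketch below (in C++, with hypothetical names; it is not taken from the Polygon Mapper source reproduced in Appendix B) treats a non-rectilinear map as nothing more than an ordered list of vertex points in stage coordinates. Pixels inside the outline receive the projected media; pixels outside it are masked out.

    #include <vector>

    // A vertex on the stage, expressed on Isadora's 0-100 percentage scale.
    struct Vertex { double x; double y; };

    // A non-rectilinear map is simply an ordered outline of the surface.
    // Any number of vertices may be used; curves are approximated by
    // placing many vertices close together along the curved edge.
    using PolygonMap = std::vector<Vertex>;

    int main() {
        // A six-sided outline traced around an irregular object.
        PolygonMap map = {
            {10, 15}, {42, 8}, {75, 20}, {80, 60}, {45, 90}, {12, 70}
        };
        // The projected media is shown only where pixels fall inside
        // this outline; everything outside the outline is masked.
        return 0;
    }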

PAGE 21

Impressive examples of non-rectilinear projection mapping are seen in various venues across the world. Companies such as V Squared Labs 49 and 1024 Architecture 50 create astonishing imagery with projections for use in concerts and marketing endeavors. Disney World has a spectacular new show called "Magic, Memories, and You" that projection maps onto Cinderella's castle in Orlando, Florida. The audience stands back and watches as 16 different projectors illuminate the castle, with vines growing up the turrets one moment and the castle bursting into flames the next. Installations like this sometimes have themes or even a story line, but overall the purpose is to give the audience a spectacular visual display. The focus is on the projected images and not on the actors, which is most likely why you rarely see actors in these types of projection installations. Programs such as MadMapper were created specifically for projection mapping and have been integrated heavily into projection software programs for VJs. A VJ is the disc jockey of video, and the live mixing component of their work makes the programs they use less than ideal for a live theatrical performance, where the actors and technicians are relying on precision timing and replicable cues. The intent of this project in lieu of thesis is to combine the cue-ability of Broadway productions with the wow factor of non-rectilinear projection mapping and make them affordable and accessible to any theatre on any budget.

PAGE 22

! ## create a comprehensive program that meets the needs of theatres across the spectrum of experience and budget. This current software review will highlight the benefits of my design of the Polygon Mapper actor within Isadora in comparison to other applications available

PAGE 23

! #$ 2.2.1. Software Comparisons* Projection mapping is a new and quickly evolving technology. Pricing also prohibits collection of information from some sour ces. The software comparisons are accurate as the available d ocumentation and resources at the time of authorship. **SDK is a software development kit allowing programmers to extend the program beyond its original scope. Isadora 64 & Polygon Mapper Actor 65 Isadora MadMapper 66 Modul8 67 & MapMapMap 68 Resolume 69 VPT 70 Pure Data 71 Max/ MSP/ Jitter 72 Quartz Composer (QC) 73 QLab 74 Watchout 75 Green Hippo 76 Under $400 $350 $350 $411 *Must use with other software $411 MapMapMap Free Plugin $411 Free Free $399 Free Limited: Free. Rentals for $3/day $4498 license for 1 display Unknown Hardware must be purchased Free Demo Free Upgrades (for new versions) N/A $50 for Audio/Video upgrade $50 for Audio/ Video upgrade Only 1 version as of 10/15/ 11 $49 $164 $550 Free Free $250 Free $599 A/V/Midi Bundle Version 2 $769 Unknown Education Discount N/A N/A N/A $275 $275 $274 $274 $207 $110 $449 Cue able User friendly Interface Unknown Dynamic Mapping Unlimited Vertices 3 or 4 3 or 4 4 4 4, 8,16 N/A N/A 4 N/A N/A N/A Unlimited Maps (per scene) 10 3 32 N/A N/A N/A N/A N/A Perspective Mapping (warping) Limited: Mac On ly with plugins & 4 vertices Limited: Mac Only with plugins N/A N/A N/A N/A N/A Concaved Maps N/A N/A N/A N/A N/A Map Manipulation N/A N/A Unknown N/A N/A N/A Media Manipulation Limited N/A N /A N/A N/A N/A Anti Aliasing Optimized & Adjustable Pre rendered Not Optimized N/A N/A N/A N/A N/A Add effects via QC Limited Cross Platform (Mac/ PC) Mac Mac Mac Mac Preview Mode Unknown Multi screen Midi Live Video Input SDK** Record and export stages Pics only Unknown No Proprietary Hardware

PAGE 24

! #% 2.2.2. Isadora "Isadora is the award winning, graphic programming environment for Macintos h and Windows that provides interactive control over digital media, with special emphasis on the real time manipulation of digital video. Since every performance or installation is unique, Isadora was designed not to be a "plug and play" program, but inst ead to offer building blocks that can be linked together in nearly unlimited ways, allowing you to follow your artistic impulse." 77 In Isadora, the "blocks are referred to as actors and they each have a different functionality. This can be to display med ia onto the screen (Projector actor), to switch something on and off (Toggle actor), or to map an object (Polygon Mapper actor). Isadora has several unique functions, which guided my decision to expand its mapping abilities with the creation of the Polyg on Mapper actor The most important feature is cueing. In theatre, or any live show for that matter, there are typically multiple acts and scenes. Each scene often requires a slightly or completely different mood or setting, conveyed with images or vid eos. Thus, it is important to have a method for cycling through those different scenes effectively.

PAGE 25

! #& 2.2.3. MadMapper MadMapper is strictly a mapping application that was designed in Mac OSX to share video s with other applications It is not designed as a show control system, therefore it cannot be cued, have multiple outputs to different projectors, or add effects to videos. Its only function is mapping. Its graphical user interface is simple, clean, and easy to understand. Yet, MadMapper cannot b e used on its own to control the projections for live performance. Another software must be the main control. For example, a video can be mapped in MadMapper then sent to another program such as Isadora or Modul8 via a free third party program called Syp hon 78 The projection map is "siphoned" from MadMapper into Isadora and controlled in the latter. Therefore, in order to use MadMapper in a cued performance the user must purchase both MadMapper and the control program. MadMapper is also not cross platfo rm limiting its use to only Mac users. MadMapper can be siphoned into and out of Isadora, Resolume and VPT

PAGE 26

2.2.4. Modul8 with free MapMapMap software

Modul8 is a video mixing and compositing program, available only on Mac OS X, that was designed for VJs (video jockeys who mix videos in real time, typically to music in concerts, night clubs, and music festivals). The mixing happens in real time by adding effects, cross-fading, and manipulating videos. While this art is a type of projection design, it is not ideal for theatres since the video performance typically changes every night. In addition, the graphical user interface is crowded and difficult to decipher for a novice user. The MapMapMap plugin for Modul8 does have a user-friendly GUI, but the mapping is limited to only 10 maps, each with only 4 vertices.

2.2.5. Resolume

Resolume is another popular VJ software that has limited mapping ability with only 3 or 4 vertices. Resolume, similar to Modul8, cannot be cued for theatrical purposes and lends itself to real-time video mixing. Unlike Modul8, it is cross-platform and has a mostly intuitive user interface.

PAGE 27

! #( 2.2.6. Video Projection Tool ( VPT ) VPT is a free projection tool for both Mac and Windows that does have mapping capabilities limited to 16 vertices and 32 maps. The mapping function is most similar to my proposed Polygon Mapper actor for Isadora since it can create concaved maps with no warping of the image. It also works with Syphon but not MadMapper. It has very limited cueing capabilities and a complicated user interface. Minimal effects can be added to the media but it cannot be animated or manipulated the way the Polygon Mapper can. At present, VPT is unstable on Mac OSX and has a tendency to freeze or crash. 2.2.7. Pure Data (PD) PD has a construction similar to Isadora. Basic functional blocks are connected together with links to create actions It has a minimal ist user interface that is difficult to navigate and u nderstand. The vocabulary for understanding PD is extensive and presents novice users with an exponential learning curve. Pure data is cross platform and compatible with Syphon. However, Pure Data's biggest downfall is its complexity and a lack of inherent cueing capab ility.

PAGE 28

! #) 2.2.8. MAX/MSP/Jitter Max/MSP/Jitter has a construction and vocabulary similar to Pure Data, which is difficult to follow. The interface looks sleeker, yet PD offers more user information if you know where to look for it. Max/M SP /Jitter can be used in combination with MadMapper for video mapping, but still lacks the ability to be cued for theatrical performance and lends itself more to live video mixi ng, audio creation and 3D rendering 2.2.9. Quartz Composer Quartz Composer is a very powerful Mac based software created by Apple and comes standard on all new Mac computers. It has functionality similar to Isadora's and PD's with building blocks to create actions. It has amazing graphic capabilities and a quad mapper, which is l imited to 4 vertices. The program is very difficult to understand and there are layers of information needed to understand how to build a patch from scratch The program itself is not able to be cued, but can be cued through QLab It is only available fo r Mac.

PAGE 29

2.2.10. QLab

QLab is a Mac-based video and audio playback software. It allows for very limited manipulation of visual media and works in conjunction with Quartz Composer. It is a very effective tool for cueing a performance but has no mapping functionality. The program is free for a limited version and $600 for the full version.

2.2.11. Watchout

Watchout is software used on Broadway and in other large live performance venues that allows the user to send audio, video, and graphics to multiple displays. Watchout does the exact same thing Isadora does for 50 times the cost. In order to project onto 6 different screens (the maximum Isadora allows), you must purchase 7 licenses at $2,250 each: one for the control computer and 6 for the display computers. Isadora can do the same under one license for $350. Watchout is an elegant program, but the price tag leaves it far out of reach of most theatre venues. Watchout does have geometry correction capabilities for the projectors but is not able to dynamically map complex shapes.

PAGE 30

! $+ 2.2.12. Green Hippo Green Hippo is a highly specialized proprietary media server and software designed for tim eline playback used in live music concerts, on Broadway, and other high profile live events. Creation of content is done in outside editing software, such as Final Cut Pro and After Effect. Media is composited in Green Hippo but not constructed. There are only a limited number of people in the world who are considered experts with Hipp o Media servers. Currently, in North America there are 29 Hippo Experts. 79 Chapter 3. Development Process Researching other projection software revealed several characteristics that should be integrated into the software extension of Isadora. Thr ee major aspects were necessary for my project to be a success. The mapping software had to be accessible to the mass public. Programs such as Watchout and Green Hippo are so highly specialized that the majority of theatres and university theatre program s simply cannot afford the technology. Software also had to be cue able This is imperative in the theatre to account for the different scenes and tones in any given performance. Complete control and versatility is also a determining factor in theatre. Most "plug and play" systems do not have this feature. The goal of my Polygon Mapper within Isadora is to provide a program that houses all of these features in one piece of software.

PAGE 31

3.1. Isadora with proposed mapping software

Isadora, in conjunction with the Polygon Mapper actor, offers versatile dynamic mapping ability with an unlimited number of vertices, map manipulation, media manipulation, effects, and adjustable anti-aliasing. The current state of the actor does not integrate warping of the image it is mapping onto the object. Further iterations of the Polygon Mapper will include this functionality. It does, however, allow for concaved maps, unlike other warped mapping software.

3.2. Characteristics for comparison

In this section, characteristics related to mapping, manipulation, and cueing will be detailed. More details on how to map with the Polygon Mapper actor can be found in the Appendix: Manual.

3.2.1. Dynamic Mapping

For the purposes of this research project, dynamic mapping is the ability to manipulate the vertices of a map by clicking on the map itself, or on a control panel with a diagram of the map. In the mapping program MadMapper, the top screen is the

PAGE 32

preview/control panel and the bottom screen is the output screen. Manipulation of the map happens on the preview/control panel as well as the output screen. The program Modul8, with the free plugin MapMapMap, also has a dynamic mapping interface, where the mapping occurs on a control panel only. In other applications that have the ability to map, such as Resolume, there is interactive control but not dynamic control of the map. Interactive means that manipulation is still done by the user, but not to the extent of dynamic manipulation, in which the actual map itself is manipulated rather than only the values associated with the map being changed. The image below illustrates the interface for Resolume and the mapping effect that is inherent in the program. Sliders on the right control vertices. Each vertex has an x and

PAGE 33

y slider; therefore, to map an object with 4 points, 8 different sliders must be set: top left X, top left Y, top right X, top right Y, bottom left X, bottom left Y, bottom right X, and bottom right Y. This is typically not an issue when mapping a small number of vertices such as 3 or 4, which is the extent Resolume allows. However, if you wanted to map an object with 27 vertex points, you would need 54 sliders. Yet there is value in having a slider for each coordinate. At times, dynamic mapping can lack precision if the object is smaller or more detailed. In these circumstances, being able to exactly pinpoint the value for the vertex is helpful. Programs such as MadMapper and Modul8 do not have this functionality. In the development of the Polygon Mapper for Isadora, both aspects of mapping, dynamic manipulation and value precision, are available. This is a unique aspect of mapping that is not found in any software readily available to the public. When a patch is created (see the manual for specific details on how to create a mapping patch), a Stage Mouse Watcher actor is used to allow the dynamic aspect of control. The Stage Mouse Watcher is one of several actors inherent in Isadora. Dynamic manipulation is the primary way of mapping using the Polygon Mapper in Isadora. Similar to other programs discussed previously, moving the vertices on the preview screen creates the map. Manipulation can also occur on the output screen if desired. This is helpful if certain angles of the object being mapped cannot be seen from the control computer's point of view. With a wireless mouse and keyboard,

PAGE 34

mapping can occur without looking at a computer screen, but rather directly at the object being mapped. See Appendix: Mapping Photos for pictures of this feature.

3.2.2. Map Manipulation

One feature that all mapping software has in common is the ability to manipulate the map once it has been created. Manipulation in this section is defined as translation, rotation, and scaling. Translation refers to the movement of the entire map from one location on the screen to another. Changing the horizontal and vertical positions on the Projector actor inside Isadora does this. The Projector actor is used to display the image on the screen. The horizontal and vertical positions control where on the screen it is located. Rotation: the map can be rotated 360° in either direction. The spin input on the Projector actor controls the map's rotation.
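The geometry behind these Projector controls is ordinary 2D transformation. The sketch below is illustrative rather than Isadora's internal code: it applies a translation, a spin in degrees, and the zoom percentage discussed next to every vertex of a map, pivoting about the map's center.

    #include <cmath>
    #include <vector>

    struct Vertex { double x; double y; };

    const double kPi = 3.14159265358979323846;

    // Translate, rotate (in degrees), and scale a map about its own center.
    // zoomPercent matches the Projector's percentage scale (100 = original size).
    void transformMap(std::vector<Vertex>& map,
                      double dx, double dy, double spinDegrees, double zoomPercent) {
        if (map.empty()) return;

        // Pivot rotation and zoom around the center of the map.
        double cx = 0.0, cy = 0.0;
        for (const Vertex& v : map) { cx += v.x; cy += v.y; }
        cx /= map.size();
        cy /= map.size();

        const double angle = spinDegrees * kPi / 180.0;
        const double scale = zoomPercent / 100.0;

        for (Vertex& v : map) {
            // Scale and rotate in center-relative coordinates.
            double rx = (v.x - cx) * scale;
            double ry = (v.y - cy) * scale;
            double rotX = rx * std::cos(angle) - ry * std::sin(angle);
            double rotY = rx * std::sin(angle) + ry * std::cos(angle);
            // Move back to stage coordinates and apply the translation.
            v.x = cx + rotX + dx;
            v.y = cy + rotY + dy;
        }
    }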

PAGE 35

Scaling is the same as zoom. The map starts at 100%. It can be scaled down to 0% and up to 700% of its original size. Scaling, or zoom, is also controlled by the Projector actor through the zoom input.

3.2.3. Media Manipulation

Only Isadora and MadMapper have the ability to manipulate the media once it is inside the map. This is an important feature in time-sensitive situations when there is no time to edit or crop the image. When an image is loaded into a map, the entire image becomes the output map. Isadora and MadMapper have the ability to show only a portion of that image inside the map. If a director would like to test what the projections would look like if the image were cropped, that is very quick and easy to do with the media manipulation functions. Without these, the designer would have to take the image into editing software such as Photoshop, edit the image, then upload it back into the program for the director to see. This process is eliminated with media manipulation functionality.

3.2.4. Concaved Maps

In its current state the Polygon Mapper does not have the ability to warp an image inside the map. It does, however, have the ability to create concaved maps, where the polygon cuts into itself. VPT is the only other mapping software that has the same characteristic.

3.2.5. Anti-Aliasing

Anti-aliasing is the technique used to smooth the jagged lines of graphics. A detailed explanation of anti-aliasing will be presented in chapter 3.3.4. There are projection

PAGE 36

design programs, such as MadMapper and Modul8 with MapMapMap, that have anti-aliasing built into the graphics rendering. This effect smooths the perimeter of the map. MadMapper offers optimized anti-aliasing and Modul8 has very limited anti-aliasing in its map. Other mapping programs, VPT and Resolume, have no anti-aliasing. Even Isadora has no anti-aliasing on basic shapes rendered inside the program. The Polygon Mapper is the only mapping program that has the ability to control the optimization of the anti-aliasing along the map edges.

3.2.6. Adding Effects

Common to all mapping software, except for MadMapper, is the effects library that can be applied to the media inside the map. Often, more effects can be added than originally come with the program. These are called plugins. A popular plugin effects library is FreeFrame, which is available in Resolume, Pure Data, Jitter, Modul8, and Isadora. There are a variety of effects offered by FreeFrame, including PanSpinZoom, Kaleidoscope, Fish Eye, Gaussian Blur, and Glow. My design of the Polygon Mapper takes advantage of these plugins to create a more robust mapping package.

3.3. Polygon Mapper Design

This section will provide a comprehensive assessment of how the Polygon Mapper was designed and functions, aspects that were touched on in the previous chapter. In addition, Isadora actors that are used in conjunction with the Polygon Mapper for editing will be detailed. The design premise of Isadora is the use of "building blocks" that can be configured and linked together to create actions. This project in lieu of thesis was the design and deployment of a new "block" allowing Isadora to have a new, unique functionality: the Polygon Mapper actor.

PAGE 37

3.3.1. Video Input

Isadora uses the term "video in" to refer to a media input format that can accept movie and picture files. Isadora also has a Core Image (CI) upgrade, available only on Mac, that utilizes the GPU for image processing rather than the CPU. CI actors are only compatible with other CI actors, can only be used on Mac, and are differentiated by the words "image in" instead of "video in." The type of media that can be used is the same in video and CI image inputs, and generally, for theatrical projection design, there is no noticeable difference between the two. The Polygon Mapper was developed to be cross-platform to allow for more widespread use in a range of theatres. Therefore, CI inputs were not used in the Polygon Mapper. The video in on the Polygon Mapper can be linked to the "video out" of a Picture Player or Movie Player actor. Acceptable file formats for pictures and movies are:

Image files: .JPG, .PNG, .PDF, .PSD
Video files: .MOV, .AVI, .QTZ

3.3.2. Mouse X and Mouse Y

Isadora simplifies its value and numbering system by viewing each screen, or "stage" as they are referred to in Isadora, on a percentage scale from 0 to 100. No matter the resolution of the stage, the values are converted to a standardized format. For example, if the output stage is 320 x 240, the coordinates for the bottom right corner of the stage will be (100, 100) rather than (320, 240).
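A minimal sketch of that conversion, assuming only that pixel positions scale linearly onto the 0-100 range (the helper names are hypothetical):

    #include <cstdio>

    // Convert a pixel position on a stage of any resolution to Isadora's
    // standardized 0-100 percentage scale, and back again.
    double toStagePercent(double pixel, double resolution) {
        return pixel / resolution * 100.0;
    }

    double toPixels(double percent, double resolution) {
        return percent / 100.0 * resolution;
    }

    int main() {
        // The example from the text: a 320 x 240 stage.
        std::printf("bottom-right corner = (%.0f, %.0f)\n",
                    toStagePercent(320, 320), toStagePercent(240, 240));  // (100, 100)

        // The center of the same stage is (50, 50) regardless of resolution.
        std::printf("center = (%.0f, %.0f)\n",
                    toStagePercent(160, 320), toStagePercent(120, 240));
        return 0;
    }

Because every stage is described on the same 0-100 scale, a map built against one output resolution keeps its proportions when the resolution changes.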

PAGE 38

The "mouseX" and "mouseY" inputs on the Polygon Mapper are designed to receive information from the Stage Mouse Watcher. The Stage Mouse Watcher actor "watches" for the mouse to enter the stage and then reports the horizontal and vertical positions (X and Y). Those values are then transferred to the Polygon Mapper inputs "mouseX" and "mouseY." As vertices are mapped, the values from the Stage Mouse Watcher are recorded into the X and Y inputs for each vertex.

PAGE 39

For added precision, the slider for each value can be used to pinpoint the location of the vertex. A value can also be typed directly into the input.

3.3.3. Key Code

The "keyCode" input on the Polygon Mapper is the receiver for the trigger from the Keyboard Watcher actor, which allows for cycling from one vertex to another for editing. The Keyboard Watcher "looks for keys on the computer keyboard to be pressed, released, or both. The 'key range' input property can be set to limit the range of characters that this watcher will see. When this watcher sees a character within the specified range, it will send the character that was typed out of the 'key' output." 80 The Polygon Mapper is programmed to "listen" for the left and right arrows to be pressed on the keyboard. Each key has a specific value associated with it. The left arrow key value is 28 and the right arrow key is 29. In the "key range" input on the Keyboard Watcher the value "28-29" must be entered, and the "key" output will be connected to the "keyCode" input on the Polygon Mapper.
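Taken together, the Keyboard Watcher and the Stage Mouse Watcher drive a simple editing loop. The stand-alone sketch below illustrates that loop and is not the actor's actual source (which appears in Appendix B): key code 29 advances to the next vertex, key code 28 steps back, and while edit mode is on, incoming mouse coordinates are written into the active vertex.

    #include <vector>

    struct Vertex { double x; double y; };

    // Key codes reported by Isadora's Keyboard Watcher for the arrow keys.
    const int kLeftArrow  = 28;
    const int kRightArrow = 29;

    struct PolygonMapperState {
        std::vector<Vertex> vertices;
        int  activeVertex = 0;   // drawn as the green edit dot
        bool editMode = true;

        // Called when the Keyboard Watcher passes a key code to "keyCode".
        void onKeyCode(int keyCode) {
            if (vertices.empty()) return;
            int n = static_cast<int>(vertices.size());
            if (keyCode == kRightArrow)        // advance to the next vertex
                activeVertex = (activeVertex + 1) % n;
            else if (keyCode == kLeftArrow)    // step back to the previous vertex
                activeVertex = (activeVertex + n - 1) % n;
        }

        // Called when the Stage Mouse Watcher reports a new position
        // on the 0-100 stage scale.
        void onMouse(double mouseX, double mouseY) {
            if (!editMode || vertices.empty()) return;
            vertices[activeVertex].x = mouseX;   // recorded into "vertex N X"
            vertices[activeVertex].y = mouseY;   // recorded into "vertex N Y"
        }
    };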

PAGE 40

! %+ The vertex that is being edited is shown with a gr een "edit dot." Idle vertices appear as red edit dots. The left and right arrow keys move back and forth from one edit dot to another in a complete circle. The right arrow key moves the active vertex clockwise, while the left arrow moves counter clockwis e. 3.3.4. Anti Aliasing The real world not seen through a screen is made up of an innumerable amount of shapes; straight, curved, continuous, broken, etc. The human eye can perceive all of these shapes. However, our method for viewing these in finite formations in a digital form is limited. 3.3.4.1 Anti Aliasing Defined Computer screens are made of discrete square pixels in a grid pattern, which are filled with light. When a vertical or horizontal line is drawn on a computer the pixel s are filled completely. (left) T he eye is not restricted to a grid the way digital sources are. Therefore, in order to get an accurate diagonal line, grid lines must be crossed. (right)

PAGE 41

! %" This is not possible on a computer monitor because each grid squ are is one indivisible pixel that cannot display more than one color. To the left is a diagonal line on a computer screen without anti aliasing. A nti aliasing is an effect used to make jagged diagonal lines appear smoother on a computer screen or other p ixel based devices. 81 In order for the diagonal line to appear smoother, the alpha channel (opacity) must be adjusted in the adjacent pixels. The line appears smoother because the edge is being blurred to some extent. (right) Below is an example using tex t. The red a' has anti aliasing applied to it, while the black a' has none. When made smaller, the red a' appears much smoother, while the black a' still looks jagged.

PAGE 42

3.3.4.2. Supersampling 82

Supersampling is a technique used in anti-aliasing. Samples are taken from each individual pixel to test two parameters of the pixel. Parameter 1: pixel location within the polygon. The first test determines if the given pixel is inside or outside of the polygon being displayed. Parameter 2: is the pixel an edge pixel. This test determines if the pixel is on the edge of the polygon, and calculates the color value that should be applied to that pixel in order to smooth out the jagged line caused by aliasing. The image below represents 1 pixel. A portion of the image being rendered crosses through this pixel. When no anti-aliasing is applied, the center of the pixel is tested to see if it should be red or white. In the case below, this pixel would be rendered as white, since the tested sample is in the white section of the pixel. This is how the jagged lines in graphics are created.
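As a concrete illustration of the single-sample case just described, the sketch below classifies a pixel by testing only its center against the polygon, using a standard ray-casting inside/outside test. It assumes the pixel is centered on its integer (col, row) coordinates, which matches the -0.5 to 0.5 jitter offsets used later; it is not the Polygon Mapper's own pointInPolygon routine.

    #include <vector>

    struct Vertex { double x; double y; };

    // Classic ray-casting test: returns true if point (px, py) lies inside
    // the polygon described by the ordered vertex list.
    bool pointInPolygon(const std::vector<Vertex>& poly, double px, double py) {
        bool inside = false;
        std::size_t n = poly.size();
        for (std::size_t i = 0, j = n - 1; i < n; j = i++) {
            bool crosses = (poly[i].y > py) != (poly[j].y > py);
            if (crosses) {
                double xAtY = poly[j].x + (py - poly[j].y) *
                              (poly[i].x - poly[j].x) / (poly[i].y - poly[j].y);
                if (px < xAtY) inside = !inside;
            }
        }
        return inside;
    }

    // With no anti-aliasing, a pixel is either fully inside the map (alpha 255)
    // or fully outside it (alpha 0), based on its center alone.
    unsigned char pixelAlphaNoAA(const std::vector<Vertex>& poly, int col, int row) {
        return pointInPolygon(poly, col, row) ? 255 : 0;
    }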

PAGE 43

Supersampling collects the color information from multiple samples inside each pixel to determine the output color for that pixel. In the example below, the pixel has been split into 4 evenly sized sub-pixels and samples are taken from the center of each. One of the four samples is in the red area. Therefore, this pixel will be filled with the color that represents 1/4 of red. (example of grid sampling) In the RGB (red, green, blue) color model, combinations of three values create the desired color. Each red, green, and blue channel has 256 values ranging from 0 to 255. For instance, the values (255, 255, 255) produce white, while (0, 0, 0) produce black. Similarly, (255, 0, 0) equals true red, (0, 255, 0) true green, and (0, 0, 255) true blue. All other colors are variations in between those values. One of the four samples taken from the pixel above lands on a red portion of the polygon, and so the RGB value will represent 1/4 of the red RGB channel. To achieve this, the program runs the following calculation:

(Full color value / Total number of samples) x Number of samples inside the color = Value of color for pixel

(255 / 4) x 1 = 63.75, rounded to 64
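The same arithmetic in code. The helper name is hypothetical; the result is rounded to the nearest integer, which is how 63.75 becomes the 64 shown above.

    #include <cstdio>

    // Per-pixel color contribution from grid supersampling, as described above:
    // (full channel value / total samples) x samples that landed inside the color,
    // rounded to the nearest integer.
    int coverageValue(int fullValue, int totalSamples, int samplesInside) {
        return (fullValue * samplesInside + totalSamples / 2) / totalSamples;
    }

    int main() {
        // The worked example from the text: 4 grid samples, 1 inside the red area.
        std::printf("1 of 4 samples red -> %d\n", coverageValue(255, 4, 1));  // 64
        std::printf("2 of 4 samples red -> %d\n", coverageValue(255, 4, 2));  // 128
        std::printf("4 of 4 samples red -> %d\n", coverageValue(255, 4, 4));  // 255
        return 0;
    }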

PAGE 44

The gradient below represents the colors and values that could be applied to the pixel depending on the 4 samples that were tested. 83

3.3.4.3. Jitter Supersampling

Optimized anti-aliasing can be added in the form of jitter points. In the previous example of grid anti-aliasing, all four points are evenly spaced in the four quadrants of the pixel. The same principle would apply if the number of samples increased to 8, 16, and so on: all the points would be evenly spaced. Jitter supersampling adds a randomization of points to get a better overall sample from the pixel. As seen here, it is possible with jitter supersampling to have a different number of points within the red, creating a different color of red to be filled in this pixel.

PAGE 45

3.3.4.4. Jitter Points Source Code

Programmers have worked with anti-aliasing for many years and have created an optimized randomization for sampling. 84 These values were used in the source code for the Polygon Mapper, and there are 8 different levels of anti-aliasing incorporated into the actor. Each level has a different number of samples taken in each pixel: 2, 3, 4, 8, 15, 24, and 66. The randomization of each sample is given an x and y value based on the coordinates inside the pixel. The values are from -0.5 to 0.5 for both x and y, with (0, 0) in the middle of the pixel, since each pixel has a value of 1 by 1.

sum = pointInPolygon(info->mCoordinates, polySize, col - 0.208147, row + 0.353730)
    + pointInPolygon(info->mCoordinates, polySize, col + 0.203849, row - 0.353780)
    + pointInPolygon(info->mCoordinates, polySize, col - 0.292626, row - 0.149945)
    + pointInPolygon(info->mCoordinates, polySize, col + 0.296924, row + 0.149994);
num_of_tests = 4;

The values in blue represent the coordinates of the jitter sample points based on 4 samples. 85 Those points are plotted on the pixel below.
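Written as a loop rather than unrolled, the same test might look like the stand-alone sketch below. The four offsets are the 4-point pattern from the OpenGL jitter.h table cited above, and the ray-casting pointInPolygon is repeated from the earlier sketch so this compiles on its own; it is an illustration, not the actor's source.

    #include <vector>

    struct Vertex { double x; double y; };

    bool pointInPolygon(const std::vector<Vertex>& poly, double px, double py) {
        bool inside = false;
        std::size_t n = poly.size();
        for (std::size_t i = 0, j = n - 1; i < n; j = i++) {
            if ((poly[i].y > py) != (poly[j].y > py)) {
                double xAtY = poly[j].x + (py - poly[j].y) *
                              (poly[i].x - poly[j].x) / (poly[i].y - poly[j].y);
                if (px < xAtY) inside = !inside;
            }
        }
        return inside;
    }

    // 4-sample jitter pattern (x, y offsets within the pixel, range -0.5 to 0.5).
    const double kJitter4[4][2] = {
        {-0.208147,  0.353730},
        { 0.203849, -0.353780},
        {-0.292626, -0.149945},
        { 0.296924,  0.149994},
    };

    // Alpha for the pixel centered at (col, row): the fraction of jittered
    // samples that fall inside the map, scaled to 0-255.
    unsigned char jitteredAlpha(const std::vector<Vertex>& poly, int col, int row) {
        int sum = 0;
        const int num_of_tests = 4;
        for (int i = 0; i < num_of_tests; ++i) {
            if (pointInPolygon(poly, col + kJitter4[i][0], row + kJitter4[i][1]))
                ++sum;
        }
        return static_cast<unsigned char>(255 * sum / num_of_tests);
    }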

PAGE 46

3.3.4.5. Anti-aliasing Within the Polygon Mapper

Patches within Isadora vary in density based on the amount and size of media within them. A patch that contains still images will be slightly faster than those that contain video files. The same is true for patches with a combination of images and video plus various effects. Having a high level of anti-aliasing can slow the computer's processor and cause lag. In live performances, it is imperative that cues trigger at the precise moment they are meant to. Too much lag in a show with data-heavy media could lead to severe delays or a complete crash of Isadora or the computer itself. To help compensate for the variety of performance types and machines that might be used, the Polygon Mapper was designed with an important feature: the user has the ability to choose the level of anti-aliasing applied to the map depending on how dense the patch is. The levels range on a scale from 0 to 7, zero representing no anti-aliasing being applied to the map. From there, the number of sample tests performed in the code increases to 2, 3, 4, 8, 15, 24, and 66 jitter points. Anti-aliasing is only performed when the map is initially drawn and when there are subsequent changes to it. When mapping a still image this effect cannot be observed. However, if anti-aliasing were constantly being applied, mapping a video file with changing frames would dramatically slow down the playback because the map would have to redraw every frame. Since anti-aliasing is applied when the map is being drawn or redrawn, it is recommended that when mapping in Isadora, the anti-aliasing be set to zero to avoid any issues. Changing the anti-aliasing value temporarily slows media down to allow for recalculation of the alpha mask. Setting the anti-aliasing to Level 7 (66 jitter points) during a data-dense performance is discouraged, since the number of calculations being performed on the pixels is substantially higher than at the lower anti-aliasing levels (level 6 = 24 jitter points).
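A small sketch of how the level setting can be read as a sample count, using the counts listed above (level 0 turns anti-aliasing off; levels 1 through 7 use 2, 3, 4, 8, 15, 24, and 66 jitter points). The array and function names are illustrative, not taken from the actor's source.

    #include <cstdio>

    // Jittered samples tested per pixel for anti-aliasing levels 0-7.
    // Level 0 disables anti-aliasing entirely.
    const int kSamplesPerLevel[8] = {0, 2, 3, 4, 8, 15, 24, 66};

    int samplesForLevel(int level) {
        if (level < 0) level = 0;
        if (level > 7) level = 7;
        return kSamplesPerLevel[level];
    }

    int main() {
        for (int level = 0; level <= 7; ++level) {
            // Cost grows with the sample count, which is why level 7 is
            // discouraged during data-dense performances.
            std::printf("level %d -> %d samples per pixel\n",
                        level, samplesForLevel(level));
        }
        return 0;
    }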

PAGE 47

The difference in the levels of anti-aliasing is apparent along the outer edges of the maps shown below.

PAGE 48


PAGE 49

3.3.5. F value

The "fValue" input allows the "verticesCount" to increase or decrease by one; this is necessary when adding and subtracting vertices from the total number needed. In its current state, the Polygon Mapper's dynamic abilities are set within the code and cannot be changed by the user.

3.3.6. Edit mode

The "edit" input turns the edit mode on and off. Since the values are received from a Stage Mouse Watcher, it was important to have a way to turn off the edit mode while still mapping on the stage. This is done by right-clicking at any point while the mouse is inside the stage; any time this happens, a trigger is generated. In order to utilize that characteristic, the "right mouse down" output of the Stage Mouse Watcher must be connected to the "trigger" input on the Toggle actor. The Toggle is designed to switch between two states, in this case on and off. The "trigger out" on the Toggle is connected to the "edit" input on the Polygon Mapper.

PAGE 50

When the edit mode is on, green and red dots are visible at each vertex. They disappear when the edit mode is off. When the mode is off, the Stage Mouse Watcher still transfers values to the "mouseX" and "mouseY" inputs, but the Polygon Mapper is programmed not to pass those values on to the vertex inputs. If the edit mode is on as the mouse exits the stage, the vertex point being edited follows the mouse off the stage, thus changing the map.

3.3.7. Dot Size

The user can choose the size of the edit dots for the vertices, ranging from 0 to 5. Dot size 5 represents a 5 x 5 pixel box, and the same holds for the other sizes. This functionality allows for more precise mapping of edges: the dot size can be reduced, or eliminated entirely, to better see the corners of the map. It is difficult to map with a dot size of zero, however, since you cannot see which vertex is active (represented by the green dot).

PAGE 51

Having control over the size of the dots also comes into play when there is a large number of vertices. In the picture on the left, there are 200 vertices with a dot size of 5. With such a large number of vertices, the dots appear as a solid line around the perimeter; lowering the dot size differentiates the vertex points more clearly.

3.3.8. Vertex Count

The most unique aspect of the Polygon Mapper compared with all other mapping software is its nearly unlimited number of vertices: a single map can contain 999 individual vertex points. The Polygon Mapper actor interface was designed to be dynamic. When a value is entered into the "verticesCount" input, the actor is populated with values for both the x and y coordinates of each vertex, labeled accordingly. The vertex coordinates are distributed equally along the outer perimeter.86 The values for the distribution of the vertices are recorded in the x and y inputs of the Polygon Mapper; the "vertex X" and "vertex Y" inputs are seen in the pictures below. Note: the first

PAGE 52

vertex is "0," not "1." "Vertex 0 X" and "vertex 0 Y" will always have the coordinates 0,0 unless they are mapped to a different location.
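The sketch below shows one way such an equal distribution along the stage perimeter could be computed on the 0-100 coordinate scale. It is an illustration only, since the distribution routine actually used in the actor was written by Angelos Barmpoutis and may differ in detail; note that vertex 0 lands at (0,0), matching the behaviour described above.

    #include <vector>
    #include <utility>

    // Distribute "count" vertices evenly along the perimeter of the stage,
    // expressed in the 0-100 coordinate scale used by the Polygon Mapper.
    std::vector<std::pair<float, float> > DistributeOnPerimeter(int count)
    {
        std::vector<std::pair<float, float> > v;
        if (count <= 0) return v;
        const float perimeter = 400.0f;                  // four sides of length 100
        for (int i = 0; i < count; ++i) {
            float d = perimeter * i / count;             // distance walked from (0,0)
            if (d < 100.0f)       v.push_back(std::make_pair(d, 0.0f));            // top edge
            else if (d < 200.0f)  v.push_back(std::make_pair(100.0f, d - 100.0f)); // right edge
            else if (d < 300.0f)  v.push_back(std::make_pair(300.0f - d, 100.0f)); // bottom edge
            else                  v.push_back(std::make_pair(0.0f, 400.0f - d));   // left edge
        }
        return v;
    }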

PAGE 53

The disadvantage of having a large number of vertices with a dynamic actor is that the actor itself becomes extremely long. Isadora accommodates this, but precision mapping with the sliders can be overwhelming. Fortunately, the main and recommended mapping technique is to work within the stage preview or the output stage to map dynamically.

3.3.8.1. Adding and Subtracting Vertices

Once a "verticesCount" has been entered into the Polygon Mapper, the values automatically populate for equal distribution around the perimeter, as discussed in the previous section. This only occurs when changing the count from zero to another number (0 to X). A standardized, user-friendly format for adding and subtracting vertices has yet to be implemented. The proposed method, illustrated below, is as follows. The stage is broken into four quadrants.

x+1: One new vertex is added at the far corner of the quadrant where the active edit dot is located.
x-1: The active edit dot is removed.
x+several: Several vertices are added at the far corner of the quadrant where the active edit dot is located.
x-several: The active edit dot plus the adjacent dots in the same quadrant are removed.

PAGE 54

[Figures: examples of the x+1, x-1, x+several, and x-several operations]
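Because the quadrant rule above is only proposed and not yet implemented in the actor, the following fragment is no more than a sketch of how the x+1 case might be interpreted; the function name, the 0-100 stage scale, and the reading of "far corner" as the outer stage corner of the quadrant are all assumptions.

    // Sketch of the proposed x+1 rule: place the new vertex at the outer stage
    // corner of the quadrant that contains the active edit dot.
    void ProposedAddVertex(float activeX, float activeY,
                           float* outNewX, float* outNewY)
    {
        *outNewX = (activeX < 50.0f) ? 0.0f : 100.0f;
        *outNewY = (activeY < 50.0f) ? 0.0f : 100.0f;
    }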

PAGE 55

3.3.9. Video Output

The "video out" output of the Polygon Mapper transfers the original media, with the new map applied, to the Projector, the actor that displays the image. The Projector actor controls which screen the image appears on through its "stage" input. Isadora can have 6 different stages, meaning 6 projectors can be controlled by the program. The Projector also provides the map manipulation discussed in the Software Comparison: translation, rotation, and scale are controlled by the "horz pos" and "vert pos" inputs, the "spin" input, and the "zoom" input respectively, along with the "intensity" (opacity) of the map and the height and width of the image.

Two very important inputs on the Projector actor are "blend" and "layer." The blend function has 3 settings: additive, transparent, and opaque. Additive simply blends the image with any other images that it overlaps, while opaque completely covers other images over 100% of the screen. The transparent mode is used when the image contains an alpha mask. The Polygon Mapper creates an alpha mask; therefore, if two masks overlap, the top projector must be set to transparent. The "layer" input organizes multiple maps into stackable layers; up to 20 layers can be used.
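To make the three blend settings concrete, here is a hedged per-channel model of what they do when two maps overlap. This is only an illustration of the behaviour described above, not Isadora's actual compositing code, and the mode numbering is arbitrary.

    // Per-pixel behaviour of the three blend settings, modelled on a single
    // 8-bit colour channel. src is the top map, dst whatever lies beneath it.
    unsigned char BlendChannel(int mode, unsigned char src, unsigned char dst,
                               unsigned char srcAlpha)
    {
        switch (mode) {
            case 0: {                              // additive: sum and clamp
                int v = src + dst;
                return (unsigned char)(v > 255 ? 255 : v);
            }
            case 1:                                // transparent: honour the alpha mask
                return (unsigned char)((src * srcAlpha + dst * (255 - srcAlpha)) / 255);
            default:                               // opaque: source covers whatever is below
                return src;
        }
    }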

PAGE 56

3.3.10. Animated Manipulation and Effects87

A unique capability in Isadora is that the translation, rotation, and scaling parameters applied to the map can be animated. No other software on the market offers this much control over every aspect of a map, which is a contributing reason why the Polygon Mapper was developed inside Isadora. For example, connecting a Wave Generator actor to the "spin" parameter of the Projector actor causes the entire map to spin through 360° continuously. The same can be done for the translation and scaling of the map. The Wave Generator "generates a sine, triangle, sawtooth, square, or random wave cycle at a regular rate of speed."88 Even the vertex coordinates can be automated by adding an Envelope Generator to the vertex value you want to manipulate. An Envelope Generator is an actor in Isadora that allows you to "smoothly ramp from a starting value to an ending value over a specified period of time."89
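The following sketch models the values these two generators produce over time: feeding the sine value into the Projector's "spin" input yields the continuous rotation described above, while the envelope ramps a parameter from a start value to an end value. The function names are illustrative and are not Isadora API calls.

    #include <cmath>

    // Sine-mode wave value in the range -1..1 at a given time and frequency.
    float SineWaveValue(float timeSeconds, float frequencyHz)
    {
        return std::sin(2.0f * 3.14159265f * frequencyHz * timeSeconds);
    }

    // Linear envelope: ramp smoothly from "start" to "end" over "duration" seconds.
    float EnvelopeValue(float timeSeconds, float duration, float start, float end)
    {
        if (timeSeconds <= 0.0f) return start;
        if (timeSeconds >= duration) return end;
        return start + (end - start) * (timeSeconds / duration);
    }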

PAGE 57

In fact, any input with a value can be animated in Isadora, which opens up an unlimited number of possibilities for projection design. This relates to the idea of a "building block" system that allows the user to truly create art by having complete control over the medium.

3.3.11. Media Manipulation

In addition to animating the map itself, the media inside the map can be translated, rotated, and scaled, a feature found only in Isadora among mapping software. Using the FreeFrame plugin, the PanSpinZoom actor is placed between the media player and the Polygon Mapper actor. If it were placed between the Polygon Mapper and the Projector, the PanSpinZoom actor would affect the entire map, not the media within it.

[Figures: original image; translate/pan; rotate/spin; scale/zoom]
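A brief sketch of the pan/spin/zoom operation as a 2D transform of a coordinate inside the map. The real PanSpinZoom FreeFrame plugin resamples pixels rather than transforming single points, so the function below and its parameter conventions are assumptions used only to show the arithmetic involved.

    #include <cmath>

    // Transform a sample coordinate: spin about the origin, then zoom, then pan.
    void PanSpinZoomPoint(float x, float y,
                          float panX, float panY, float spinDegrees, float zoom,
                          float* outX, float* outY)
    {
        const float rad = spinDegrees * 3.14159265f / 180.0f;
        float rx = x * std::cos(rad) - y * std::sin(rad);   // spin
        float ry = x * std::sin(rad) + y * std::cos(rad);
        *outX = rx * zoom + panX;                           // zoom, then pan
        *outY = ry * zoom + panY;
    }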

PAGE 58

3.3.12. Mapping Concave Polygons

The construction of the polygon map is based on the point-in-polygon algorithm.90 The algorithm tests points to determine whether they reside inside or outside of the polygon. If a point is outside, the image being mapped is hidden there; if inside, the image is revealed. Taking into account the x and y coordinates of each vertex, the number of sides created by the vertices, and the width and height of the stage, the algorithm is performed for every coordinate in the stage as the map is being constructed. The source code is as follows:

    int pointInPolygon(float* coordinates, int polySides, float xx, float yy, int width, int height)
    {
        int i, j = polySides - 1;    // delete "- 1" to occlude polygon
        bool oddNodes = false;
        float x = xx * 100.0 / width;
        float y = yy * 100.0 / height;

        for (i = 0; i < polySides; i++) {
            if ( (coordinates[(i*2)+1] < y && coordinates[(j*2)+1] >= y) ||
                 (coordinates[(j*2)+1] < y && coordinates[(i*2)+1] >= y) ) {
                if ( coordinates[(i*2)+0] +
                     (y - coordinates[(i*2)+1]) / (coordinates[(j*2)+1] - coordinates[(i*2)+1]) *
                     (coordinates[(j*2)+0] - coordinates[(i*2)+0]) < x ) {
                    oddNodes = !oddNodes;
                }
            }
            j = i;
        }

        if (oddNodes == false) return 0; else return 1;
    }

The image to the left represents two vertices that have been mapped with the Polygon Mapper, as indicated by the red edit dots. They are connected by a line, which does not appear on the stage while mapping. For the Polygon Mapper to work successfully, it needs to know on which side of that imaginary line to allow the image to be seen.

PAGE 59

While mapping, if only two vertices are used, no image is seen, since a minimum of 3 vertices is needed to create a polygon. The images below show a closed polygon. The program now must know where to put the image: inside the polygon or outside it. In the Polygon Mapper, this line of code determines that the image should be filled inside the polygon; if the "- 1" were deleted, the polygon map would be inverted:

    int i, j = polySides - 1;    // delete "- 1" to invert the polygon map

The determination of which side of the imaginary line is inside or outside is made by a simple test. Points throughout the screen are tested (purple dots) and compared against the points of the same y coordinate that fall on the lines created by the vertices (blue dots). If the number of blue dots on either side of the test point is odd, the point is inside the polygon. If the number of blue dots on either side of the test point is even, the point is outside the polygon. This test allows the Polygon Mapper to create complex concave maps.

PAGE 60

In line A there are 2 blue dots on either side of the purple dot; an even number indicates that the point is outside of the polygon. In line B there are 3 blue dots on either side of the purple dot; an odd number indicates that the point is inside the polygon. Therefore, when the map is created it will look like the image below.

Another example of a concave map is seen below. The picture on the left represents the non-rectilinear vertex map; the picture on the right is the same map projected onto the object.
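The short test program below exercises the pointInPolygon routine listed in the previous section on a concave, five-vertex polygon to show the even-odd behaviour just described. The sample coordinates and the 100 x 100 stage size are chosen only for illustration, and the program assumes it is compiled together with the listing above.

    #include <cstdio>

    // From the listing in section 3.3.12.
    int pointInPolygon(float* coordinates, int polySides, float xx, float yy, int width, int height);

    int main()
    {
        // x,y pairs of a concave polygon: a notch rises to (50,50) between the
        // two lower corners, making the shape non-convex.
        float coords[] = { 10,10,  90,10,  90,90,  50,50,  10,90 };
        const int sides = 5;

        // A point between the notch and the top edge: odd crossing count -> inside (1).
        printf("point (50,20): %d\n", pointInPolygon(coords, sides, 50, 20, 100, 100));
        // A point in the notch itself: even crossing count -> outside (0).
        printf("point (50,80): %d\n", pointInPolygon(coords, sides, 50, 80, 100, 100));
        return 0;
    }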

PAGE 61

3.3.13. Adding Effects

A variety of effects can be added to the media itself. For a detailed explanation and further examples of how this is done in Isadora with the Polygon Mapper, see the Appendix: Manual. To add an effect in Isadora, insert the effect actor between the media player and the Polygon Mapper actor via the "rgb in" or "video in" inputs.

3.4. Cueing with Isadora

The most theatre-friendly aspect of Isadora is how clearly and easily a live performance can be cued. Using the Keyboard Watcher actor introduced previously, along with the Jump actor, any production can be triggered with an unlimited number of cues. Below is a very common method for cueing in Isadora: each scene receives three jump commands: go to next scene, go to previous scene, and go to first scene.

The Jump actor has two modes, relative and absolute. Relative allows you to jump plus or minus x number of scenes relative to your current scene location; this mode is the easiest way to set up linear cueing through scenes. Absolute mode allows you to select the exact number of the scene you wish to jump to; this is the ideal setting for moving to the beginning of the production.
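A sketch of the two Jump modes as plain scene-index arithmetic; inside Isadora this behaviour is configured on the Jump actor rather than written as code, so the functions and their clamping behaviour are illustrative assumptions.

    // Relative mode: move +/- offset scenes from the current scene (1-based index).
    int JumpRelative(int currentScene, int offset, int sceneCount)
    {
        int target = currentScene + offset;        // e.g. +1 = next, -1 = previous
        if (target < 1) target = 1;
        if (target > sceneCount) target = sceneCount;
        return target;
    }

    // Absolute mode: go straight to a chosen scene, e.g. scene 1 for the top of the show.
    int JumpAbsolute(int targetScene, int sceneCount)
    {
        if (targetScene < 1) targetScene = 1;
        if (targetScene > sceneCount) targetScene = sceneCount;
        return targetScene;
    }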

PAGE 62

In either mode the user can choose exactly which scene they wish to cue, so cueing can be done linearly or non-linearly. This is helpful when images are repeated in a show with heavy media: instead of adding another scene that is exactly the same, Isadora can be programmed to jump to and from any location in the scene timeline. The Jump actor also controls the fade between scenes.

The Keyboard Watcher controls how the cues are triggered. In the example below, the space bar triggers the next scene; this is done by entering the space bar character in the "key range" input. When using non-numerical values in the Keyboard Watcher, single quotes must be placed on either side of the letter; numerical values do not need single quotes. Note: as discussed previously, each key has a value associated with it. The keys 0 through 9 carry the actual values 0 through 9, while two-digit values correlate with non-numerical keys.

Isadora can also trigger from a control panel that the user creates to fit the individual needs of the performance. During the run of the show, the only program interface that is seen is what the user designs.

PAGE 63

Cueing from the Control Panel is done by connecting Buttons in the Control Panel with Jumps in the Scene Editor via Control IDs; there is no need for Keyboard Watchers. Other devices can also control the cueing of Isadora, including MIDI controllers, joysticks, touchscreen applications, etc.

3.4.1. MIDI Control

MIDI is a "protocol that allows computers and other electronic equipment to communicate and synchronize with each other."91 A detailed explanation of MIDI control is beyond the scope of this paper. However, it should be noted that Isadora has the ability to trigger other applications such as Pure Data and Ableton92 (a sound playback software). With a single press of a button, projections, lighting, and sound can be cued simultaneously. Isadora's cueing ability makes it the ultimate show control for an affordable price: automatic, synchronized cueing eliminates the need for multiple board operators, a necessity when working on a small budget with limited resources.

3.5. Troubleshooting

Throughout the design process several issues arose, from the design of the Polygon Mapper to its practical deployment on the show control computer. This section addresses several of the major difficulties and how they were resolved.

PAGE 64

3.5.1. Tracking the Edit Dots

An issue first encountered when using the Stage Mouse Watcher was that the vertex being edited would automatically jump to the location of the mouse. If the user is editing vertex 1, the mouse is located at that vertex; when the user presses the arrow keys to move to vertex 2, vertex 2 relocates to vertex 1's position because that is where the mouse is. When initially editing a map this was not an issue. However, it creates a very time-consuming problem when fine-tuning a map: the vertex jumps to the arbitrary position of the mouse, limiting the ability to make small, quick adjustments. This issue was resolved by creating a "sticking" effect between the mouse and the edit vertex. Once a vertex is made active, its dot stays in place until the mouse moves within a 25-pixel zone of the vertex. At that point the dot attaches itself to the mouse and follows it to the desired location. This prevents the vertices from jumping to random and unwanted coordinates. Further testing on a large scale revealed that a 25-pixel zone was too large; it was scaled down to 10 pixels for more exact mapping.

3.5.2. Resolution Issues

Standardizing file formats and resolution sizes is a common practice in digital media: all files used in a single performance should be the same resolution (320 x 240, 640 x 480, etc.). However, this is not always possible, nor should a lack of standardization result in improper functioning of the software. In previous versions of the Polygon Mapper, the actor expected all media to share the same resolution. If a 320 x 240 image was loaded into Isadora and then mapped, changing the image inside the map to another 320 x 240 image would have no effect on the map.

PAGE 65

However, changing from a 320 x 240 image to a different resolution, for example 640 x 480, would randomly do one of two things: either the map would be relocated and resized, or a scrambled image like the one shown would appear at the new resolution. The former case shows the map keeping its 320 x 240 dimensions on the larger 640 x 480 image; the latter is a more confusing image to decipher.

When an image is loaded into the actor, an image buffer is allocated for that image. An image buffer is a piece of memory in the computer that holds the image data at its resolution. If the initial image is 320 x 240, the computer allocates an array of data at those dimensions. When the image is changed to 640 x 480, the image buffer remains at 320 x 240. As the computer

PAGE 66

tries to display the 640 x 480 dimensions, it reads the first two lines of 320 pixels as the first line of 640 pixels. The array in the image buffer can be thought of as boxes, each representing one pixel. For the illustrated example below, the image size is 16 x 12. (Note: in the computer's memory the "boxes" are actually in one continuous line.) This blue-and-red striped image is loaded into the Polygon Mapper, and a 16 x 12 image buffer is allocated, one "box" for each pixel in the image. In the computer's memory the first line of 16 boxes is red, the second line is blue, and so on. When the same image at a resolution of 32 x 24 is loaded into the Polygon Mapper after the buffer has already been allocated, the new image uses the boxes that are already loaded to try to fill itself. The first red line of 16 boxes remains, but since the new image has 32 pixels in its first line, the second set of 16 pixels, which is blue, is used to finish that line of 32. Once 6 lines of 32 pixels have been used for the new image, the original image buffer has no more memory assigned to that image. Therefore, the computer fills in the remaining lines with whatever happens to be in the computer's random access memory (RAM).
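The arithmetic behind this misread can be sketched directly; the numbers follow the 16 x 12 / 32 x 24 example above, and the small program is purely illustrative.

    #include <cstdio>

    int main()
    {
        const int oldW = 16, oldH = 12;      // dimensions the buffer was allocated for
        const int newW = 32;                 // width the reader now assumes
        const int allocated = oldW * oldH;   // 192 pixels actually owned

        // Pixel (row 1, col 0) of the assumed 32-wide image reads flat offset 32,
        // which is row 2, col 0 of the original 16-wide buffer: the rows interleave.
        int offsetA = 1 * newW + 0;
        printf("new (1,0) -> old (%d,%d)\n", offsetA / oldW, offsetA % oldW);

        // Pixel (row 6, col 0) reads flat offset 192, already past the 192-pixel
        // allocation, i.e. whatever happens to sit in RAM at that address.
        int offsetB = 6 * newW + 0;
        printf("new (6,0) past allocation: %d\n", (int)(offsetB >= allocated));
        return 0;
    }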

PAGE 67

To avoid this issue, all images loaded into the Polygon Mapper are made resolution agnostic by converting their resolution into a percentage on a scale of 0 to 100.

3.5.3. Polygon Mapper Unlinking Bug

The only bug in the Polygon Mapper code that affects functionality is the unlinking of the Polygon Mapper from the Projector once mapping has been done. After the values of the map have been set in the Polygon Mapper, deleting the link between the Polygon Mapper and the Projector causes Isadora to crash. This crash does not seem to happen when any other link is deleted from the Polygon Mapper. I suspect this happens because of a discrepancy in the way the map is drawn and displayed: when the display is interrupted by unlinking the Projector, the Polygon Mapper can no longer output the image and crashes. Further testing must be done to debug this issue.

Since Isadora is so versatile, there is a way around this. A User Actor is a subpatch that can be added to any patch (two or more actors connected together by a link). This subpatch holds additional actors that contribute certain functionality to the patch as a whole. The Polygon Mapper, along with the actors used to control it, can be placed inside a User Actor, which prevents the Polygon Mapper from ever being unlinked directly from a Projector.

PAGE 68

Double-clicking on the User Actor opens it, and the entire Polygon Mapper patch should be placed inside. A User Input and a User Output connect to the "video in" and "video out" of the Polygon Mapper. If you unlink the Polygon Mapper User Actor from the Projector, Isadora will not crash.

3.5.4. Isadora Issues with the Show Control Computer

The show control computer for this project in lieu of thesis is a Mac Pro running OS X 10.6.8 with the most recent version of Isadora installed, version 1.3 of 21. Isadora and the Polygon Mapper had been functioning properly on the Mac for weeks. Unexpectedly, Isadora began to crash. Thinking there was a bug in the Polygon Mapper actor, it was removed from the computer; this did not rectify the problem. All possible reasons for the crash were meticulously ruled out: the Polygon Mapper, corrupt media, necessary software updates, video format issues. To no avail, Isadora continued to crash. Eventually, the problem was narrowed down to the Pop-up Toolbox (right): any time the user double-clicked inside the Scene Editor to get the Pop-up Toolbox, Isadora would freeze

PAGE 69

and inevitably crash. However, if actors were retrieved from the main Toolbox on the left side of the interface, Isadora responded as expected. With this information, a crash report was submitted to Mark Coniglio, the creator of Isadora, outlining the issues. Within the hour he responded with more questions about the nature of the crash and asked me to run a program that would compile the previous 3 crash reports and send them to him. The crash reports indicated that the issue was indeed with the Pop-up Toolbox. Mark, however, was not able to duplicate the problem, nor had he heard of any such issue in the past. Over the next two days, Mark and I worked together to fix it. Ultimately, Mark discovered that the issue was the drawing of the alpha-blended edges on the toolbox, a cosmetic characteristic of the program. He made a few changes in the code and the new version worked on the show control computer. That version was later released to the public as version 1.3 of 22.

3.6. Conclusion of the Development Process

The Polygon Mapper actor, like all the actors in Isadora, is a building block. The examples given in this paper are merely a small piece of what can be created. In the hands of a seasoned Isadora programmer, the functionality can be altered or changed altogether. The possibilities are truly limitless because of the expandability of Isadora.

Chapter 4. The Last Unicorn: A Performance Within a Project

Equally as important as the technology created for this project was the practical deployment of the Polygon Mapper in a live performance. In conjunction with the Digital Worlds Institute and the School of Theatre and Dance (SoTD) at the University of Florida (UF), the Polygon Mapper with Isadora was integrated into the production of The Last Unicorn.

PAGE 70

The Last Unicorn was originally a novel written by Peter S. Beagle93 and published in 1968. It is a timeless fantasy story that follows the journey of a unicorn who believes she is the last of her kind and sets off to find others like her. The novel gained huge success, and in 198294 an animated film was released with Mia Farrow as the Unicorn and other vocal talents such as Jeff Bridges and Angela Lansbury. Since 1968 the novel has never been out of print, and in January 2011 a graphic novel was released. In 2002, The Last Unicorn was adapted for the stage by playwright Le Anne Garland, and since then only 2 staged readings have been performed. It was an honor to be granted permission to produce a studio theatre production as the vehicle by which to showcase this project in lieu of thesis. Together with Tiza Garland, an Associate Professor at UF's School of Theatre and Dance who agreed to direct the play, a design team and ensemble were brought together to create a production that could transform a flat, empty performance venue into a 3-dimensional world.

4.1. Installations

Designing the installations for the performance of The Last Unicorn was a complex process with many things to consider. The production space was at the Digital Worlds Institute in the Research, Education, and Visualization Environment (REVE). The REVE is unique because it has five 17' x 17' screens that surround the stage to create a semi-immersive environment. The plan was to incorporate the REVE's in-house screens and projectors as well as two additional projectors for the installations. The shape of the final installations, Tetris-like configurations, served as both set pieces and projection surfaces. This

PAGE 71

entire performance was produced with no budget: the wood, the construction space, and the housing space for the installations were all borrowed or donated, and the entire set was built out of scrap pieces of plywood.

The REVE is a functioning classroom and theatre, so the set had to be constructed in a way that allowed for easy breakdown and condensing, which led to the modular design. There are 15 separate pieces that connect together to build the 3 installations. They are connected by loose-pin hinges or barrel bolts that allow for fast and easy breakdown. At the beginning and end of every rehearsal the set was taken apart and moved out of the way for the following day's events. If needed, the set could be compacted into itself and housed in a 10'(w) x 3'(l) x 5'(h) space.

The 5 in-house projectors are mounted to the ceiling roughly 18 feet high, and their sight lines were taken into consideration when building the set. It was determined that the installations had to be 8 feet away from the wall to be 6 feet high; if the installations were any taller or closer to the screen, the REVE house projector image would be seen on the top of the installations. The installations also had to be sturdy enough to withstand multiple actors sitting, standing, climbing, and jumping on them at the same time. The set was built from two thicknesses of plywood, and the modular design helped the soundness of the set by offering more weight-bearing supports than building it as one unit would have.

PAGE 72

The installations needed to have a functional use within the storyline. After much collaboration with the Director, it was deemed necessary to have three different types of installations. The first was the "Arch," the far stage left installation. At several times throughout the play the scene takes place at an entryway of some kind, and it was important that the installations created a 3-dimensional playing space rather than simply having the setting appear on the 5 screens in the background. The installations became the entrance to the castle and the doorway to the scullery. The overhanging feature of the arch provided that projection space without being overt; the goal was to have a set piece that could be a doorway one minute and a tree the next.

Next to the Arch was another tall installation piece that helped to frame the feeling of a doorway, the center stage left installation (the "Tall One"). It also lends itself to the several forest scenes, with trees projected on it. At one point in the show a tree comes to life, so the actor was able to perch herself on the installation and have the tree projected onto her body. Both stage left installations are 3' x 3' x 6'.

An installation that allowed for more interaction with the actors was created for stage right. Longer than it was tall, so that actors could stand or sit on it, use it as a table, and so on, the stage right installation was designed with dimensions of 8' x 5' x 4'. Several times

PAGE 73

throughout the performance, the stage right installation or "mound" needed to change to complement the action. For example, when the king sits on his throne, the center section folds back to give the semblance of a throne. There are two different moving pieces on the "mound," which are changed throughout the play to fit the needs of the scene. All four configurations can be found in the Appendix: Installation Photo Gallery. When first designing the moving components, I was concerned about the ability to standardize the different configurations: if a reconfiguration was not in the same place as in previous scenes, the projections would be off. To avoid this issue, each moving piece is attached to a piano hinge, which regulates the movement so the map and the set will always match.

Two Christie projectors were mounted to the ceiling of the REVE below the center in-house projector and were used to project onto the installations. The left projector was used for the mound and the right for both stage left installations. When determining the final placement of the set pieces, the image size of the projectors needed to fill the entire space of the installations. With a throw distance (the distance from projector to destination) of approximately 20 feet, the image size of the Christie projectors was roughly 18' x 14'.

4.2. Projectors, Sight Lines, and Shadows

Placement of the installations was determined by the image size of the Christie projectors, the sight lines of the in-house projectors, and the shadows created by the installations themselves. In the design of the installations, the shadows that would be created if a higher portion of the installation protruded out farther than a lower

PAGE 74

portion were considered. The installations were designed so that the deeper portions were at the bottom, becoming narrower as they got taller. Generally, when designing an installation, the projector is pointed straight at the object, eliminating shadows of this nature. However, working within the confines of the space, the installation projectors were mounted on the ceiling and aimed downward at approximately a 20° angle. In addition to the vertical angle of the projectors, the horizontal angle of the installations had to be modified to minimize shadows on the set pieces. In their original location there were several shadows created by the installation; after turning the installation angle slightly, the shadows disappeared.

Going into this staging of The Last Unicorn, the director and I acknowledged that sight lines would be an issue, both for the audience and for the actors. The installations go all the way from the floor to between 4 and 6 feet high. If the audience sat in non-raked seating, seeing the lower portion of the installations would have been a problem. To avoid this, all the seating was removed from the house and audience members were seated on the floor, with chairs at the back of the house for elderly patrons. Floor seating provided the more casual atmosphere that the play required, and it also rectified most of the audience sight line issues.

Since the actors' playing area was on, around, and in front of the installations, keeping the projections from hitting the actors was impossible. Of course, when an actor

PAGE 75

stood next to an installation there were no issues, because the projections were mapped exactly to the set pieces. Standing in front of or sitting on the installations, on the other hand, meant the projection would fall directly on the actor's person. To combat this artifact of circumstance, a balance had to be created between projections and lighting: in moments where the action was more significant, actors were well lit by the lighting, washing out the projections on their bodies as well as on the installations, and vice versa.

Projections being washed out by lighting versus actors not being well lit is a constant battle in theatre. Having a projector with a high luminosity, or brightness, helps to overcome this problem. The Christie installation projectors have 5,800 lumens.95 To put that in perspective, a standard 40-watt incandescent light bulb produces about 500 lumens,96 and the 16 projectors used in Disney's castle-mapping performance each have 20,000 lumens. The projectors used for the installations in The Last Unicorn are decent projectors; however, any Digital Light Processing (DLP) projector can be washed out with enough environmental lighting.

4.3. Projection Artwork

The goal for this staging of The Last Unicorn was to use only original artwork. To do this, a Scenic Content Designer, Elaine Sponholtz, a Master's graduate from the Digital Worlds Institute, was brought onto the design team to design and paint the artwork that would become the backdrops for the scenes and the installations. A very important technical aspect of creating physical artwork for projected imagery is ensuring that the digitization of the art is created at the exact resolution of the final projection screen. The 5-screen system at the REVE is 7000 pixels by 1050 pixels, each screen being 1400 x 1050. A single contiguous backdrop was created by painting 5 individual panels that fit together to make one scene; each panel matched the aspect ratio of one screen, at 19.4" x 14.6".

PAGE 76

The first attempt to digitize the art was to scan each panel on an 11" x 17" scanner and blend the edges together. The scanner resolution was set much higher than the screen resolution, and when the scanned image tests were projected, noticeable moiré patterns appeared in the art. Moiré patterns (see image on right97) are created when a pattern in the subject overlaps an opposing pattern in the medium;98 for example, the pattern of paintbrush strokes is offset from the lines in a scanner or computer screen. Ensuring matching resolution for all artwork and projectors is the only way to eliminate the moiré artifact. Photographing the artwork turned out to be the best way to digitize it for our purposes; it also allowed us to standardize the resolution without having to blend edges together.

4.4. Installation Artwork

The artwork for the installations was specifically created with the set pieces in mind. Using the silhouette of the installations as a guide, the art took a form similar to its surface. The Christie projectors ran at a resolution of 1280 x 1024, and the artwork for them was created at the same resolution. The installation artwork was treated as an extension of the 5-screen backdrop: at times the backdrop was a forest, a road, a scullery, or a cave, and the installation artwork coordinated with it as a tree, haystacks, or a cupboard. When it was not necessary for the installations to be specific set pieces, textures were projected onto them to set the atmosphere of a cave or beach.

PAGE 77

Conclusion

This project is unique because it is not based solely in theory or on a proof of concept: an accessible, controllable piece of technical theatre has been created. The Polygon Mapper expands the scope of the industry by pulling projections out of the background and into the action, while still providing the balance and control that is needed in a Total Theatre environment, made available to all types of theatres on any budget. This project in lieu of thesis represents time well spent in graduate school and the creation of a new career. In the words of Zachary Borovay:

I am a projection designer. I am not a lighting designer, although I use a specialized lighting instrument to convey my design. I am not a scenic designer, although my imagery can be graphic or scenic in nature. I am not a sound designer, although my media may include an audio element. I do not see my job as a stepping stone to any other discipline. I do not aspire to be a lighting, scenic, or sound designer. I am perfectly happy being a projection designer.99

1 Bursill, Web.
2 Magic lantern, Wikipedia.
3 Szanto, 23.
4 Innes, 23.
5 Dawson, 1.
6 Innes, 184.
7 Emily Mann's credits include Still Life (19), earning Mann 6 Obie Award nominations, including Distinguished Playwriting and Distinguished Direction, and Having Our Say (1995), nominated for 3 Tony awards including Best Play and Best Director. LoBiondo.
8 Dawson, 5.
9 Dawson, 8. Partial table from text.
10 Piscator, 182.

PAGE 78

11 Ibid.
12 Erwin Piscator (1893-1966).
13 Innes, 16.
14 Tytell, 27.
15 Innes, 17.
16 Hopkins, Introduction.
17 Hopkins, 1.
18 Esaak, Web.
19 Ibid.
20 Hopkins, 11.
21 Innes, 17.
22 Probst, 29.
23 Innes, 17.
24 Innes, 18. Other influences include: the exaggeration of physical character through masks, experimentation with gramophones, breaking the illusion between stage action and audience in order to alienate the audience.
25 Innes, 18.
26 Innes, 107.
27 Probst, 9-10.
28 Dawson, 59.
29 Probst, 16.
30 Probst, 14.
31 Leverich, 440.
32 Leverich, 346.
33 Bloom, 23.
34 Kramer, Web.
35 Savran, Introduction.
36 Dawson, 60.
37 Beck, Web.
38 Savran, Introduction.
39 The Builders Association, Web.
40 Innes, 189.
41 Innes, 198.
42 Innes, 199.
43 Brecht, 77.
44 Innes, 192-3.
45 Zachary Borovay, Personal Website.
46 Peter Flaherty, Personal Website.
47 Wendall Harrington, Personal Website.
48 Dodson, web introduction.
49 V Squared Labs.
50 1024 Architecture.
64 "Troika Tronix Isadora."
65 Proposed software developed and outlined in this thesis.
66 MadMapper.

PAGE 79

67 Modul8 VJ Software.
68 Modul8 with MapMapMap, Blog.
69 Resolume VJ Software.
70 VPT 6.0.
71 Ibid.
72 Cycling '74.
73 Quartz Composer.
74 Figure 53, QLab.
75 Dataton Watchout.
76 Green Hippo.
77 "Troika Tronix Isadora," Web. Software description.
78 Syphon.
79 Green Hippo.
80 Isadora, Keyboard Watcher Actor Help Description.
81 Spatial anti-aliasing, Wikipedia.
82 Supersampling, Wikipedia.
83 Decimal values are rounded up; the largest value is 255.
84 "Jitter.h," optimized jitter point values.
85 The other jitter point tests can be viewed in the source code appendix.
86 Algorithm created by Angelos Barmpoutis.
87 Video clips of animated map effects can be found at http://vimeo.com/brittdesign/videos.
88 Isadora, Wave Generator Actor Help.
89 Isadora, Envelope Generator Actor Help.
90 Finley, sample source code.
91 MIDI, Wikipedia.
92 Ableton website.
93 Beagle, The Last Unicorn, novel.
94 The Last Unicorn, animated film.
95 "Christie Projectors," MSRP.
96 Incandescent light bulb, Wikipedia.
97 Photography Credit, "Moire on Parrot Feathers.jpg."
98 Moiré pattern, Wikipedia.
99 Borovay, Web.

PAGE 80

APPENDIX A: Manual

Introduction 82
SECTION 1. Isadora Installation 82
1.1 System Requirements 83
1.2 Download 84
1.3 Installation 84
2. FreeFrame Plugins Installation 86
2.1 Download 86
2.2 File Path 86
3. Polygon Mapper Plugin Installation 86
3.1 File Path 86
4. Using the Polygon Mapper 86
4.1 Isadora Graphical User Interface 87
4.1.1 GUI Terminology 87
Scene Editor 87
Toolbox 87
Toolbox Filter 87
Pop-up Toolbox 87
Scene List 87
4.1.2 Other Terminology 88
Actor 88
Stage 88
Patch 88
4.2 Importing Media 89
4.2.1 Acceptable file formats 89
4.2.2 Media Bin 90
4.3 Creating the Patch 90
4.3.1 Retrieving the actors 90
4.3.2 The Actors 92
Media Player 92
Polygon Mapper: Dynamic 92
Projector 92
Stage Mouse Watcher 92
Keyboard Watcher 92
Toggle 92
Pan, Spin, Zoom 92

PAGE 81

4.3.3 The Stage Setup 92
4.3.4 Step by Step Patch Build 95
Step 1: Add Media Player 95
Step 2: Load Media 95
Step 3: Add Polygon Mapper 96
Step 4: Add Projector 96
Step 5: Add Stage Mouse Watcher 97
Step 6: Add Keyboard Watcher 97
Step 7: Add Toggle 98
4.4 Creating a Polygon (Continued from section 4.3.4) 98
Step 8: Adjust Dot Size 98
Step 9: Add Vertices 98
Step 10: Show Stages 99
Step 11: Turn on Edit Mode 99
Step 12: Map Vertex 99
Step 13: Move to Next Vertex 99
Step 14: Map Remaining Vertices 100
Step 15: Turn off Edit Mode 100
Step 16: Turn on Anti-Aliasing 101
4.5 Aligning Media 101
4.6 Adding Effects 102
Footnotes 103

PAGE 82

This manual details the download, installation, and use of Isadora, created by Mark Coniglio; the FreeFrame plugins created by Pete Warden; and the Polygon Mapper plugin created exclusively for Isadora. The Polygon Mapper plugin was the subject of a project in lieu of thesis presented to the College of Fine Arts of the University of Florida in partial fulfillment of the requirements for the degree of Master of Arts in Digital Arts and Sciences under the title NON-RECTILINEAR PROJECTION DESIGN FOR LIVE CUE-ABLE THEATRICAL PERFORMANCE by Brittany Powell, December 2011.

1. Isadora Installation

Isadora is the award-winning, graphic programming environment for Macintosh and Windows that provides interactive control over digital media, with special emphasis on the real-time manipulation of digital video. Because every performance or installation is unique, Isadora was designed not to be a "plug and play" program, but instead to offer building blocks that can be linked together in nearly unlimited ways, allowing you to follow your artistic impulse.1

PAGE 83

1.1 System Requirements

Please check the system requirements2 before downloading Isadora.

Mac OS X Standard
Requires OS X 10.3 or greater
Intel or PowerPC processor
1.0 GHz / 1.0 GB RAM (minimum); 2.0 GHz / 2.0 GB RAM (recommended)
QuickTime 7 minimum, 7.5 recommended

Mac OS X Core
The Core version of Isadora leverages the Mac OS X operating system by "adopting" all Core Image, Quartz Composer, and Core Audio plugins found on your computer, making these modules available within Isadora. (This includes third-party plugins, as long as they are installed in the standard locations.) The Core Video feature (Core Image + Quartz Composer plugins) costs US$25 per license; the Core Audio features cost an additional US$25 per license. You may purchase these upgrades when you order the Mac OS X Standard version, or you may add them at a later time.
Requires OS X 10.4 or greater
Intel or PowerPC processor (Core Duo recommended)
1.0 GHz / 1.0 GB RAM (minimum); 2.0 GHz / 2.0 GB RAM (recommended)
QuickTime 7.5 or greater

Windows 7 / Vista / XP
Windows 7 / Vista / XP
Intel Pentium 4 minimum, Core Duo recommended
1.0 GHz / 1.0 GB RAM (minimum); 2.0 GHz / 2.0 GB RAM (recommended)
QuickTime 7.5 or greater

USB Key Version
All Isadora versions may be ordered with an optional USB key. If you must frequently move your working environment from computer to computer, purchasing a USB key may prove useful. You may also choose to switch to the USB key version at a later date. Please make sure to read the USB Key

PAGE 84

Policy3 before purchasing a key.

1.2 Download the Latest Pre-release Version

Visit the Troika Tronix website at http://www.troikatronix.com/izzy-download.html to download Isadora. You may download and use the demo version of Isadora for free; it has all the functionality of the full version except the ability to save your work. You must register and purchase a license to enable the save function. Downloading the OS X Core version requires an additional audio and video upgrade available for purchase. The latest pre-releases can be found at http://www.troikatronix.com/izzy-prereleases.html. Currently, the most recent pre-release is version 1.3 of 19, released on September 29th, 2011.4

1.3 Isadora Installation

(If Isadora has already been installed, please skip to section 2.0 on pg. *)

The following are the download instructions for Mac OS X; the Windows installation will vary slightly. Download the .dmg file for installation. If you do not see the Introduction page automatically, click on the installer icon on the desktop, then click on the "Isadora Core Installer" button at the bottom of the window. The Introduction begins the installation. Press Continue.

PAGE 85

Follow the prompts through the installation process.

PAGE 86

2. FreeFrame Plugin Installation

(If the FreeFrame plugins have already been installed, please skip to section 3.0 on pg. *)

FreeFrame is an open-source, cross-platform, real-time video effects plugin system.5

2.1 Download and Install

For optimal use of the Polygon Mapper, download Pete Warden's FreeFrame plugin update found at http://www.troikatronix.com/izzy-prereleases.html. Once the .dmg file has been downloaded, follow the installation guide in section 1.3.

2.2 File Path

The FreeFrame plugins should automatically be placed in the proper file path: Library/Application Support/FreeFrame/

3. Polygon Mapper Installation

This plugin will allow for the mapping of objects with a nearly unlimited number of vertices.

3.1 File Path

Place the .izzyplug file in the same folder as the FreeFrame plugins. File path: /Library/Application Support/FreeFrame/

4. Using the Polygon Mapper

(If you are familiar with the GUI of Isadora, skip to section 4.3.)

The Polygon Mapper works in conjunction with Isadora actors and FreeFrame plugins to produce a multi-point object mask with a high level of versatility and control.

PAGE 87

4.1 Isadora Graphical User Interface

Once Isadora, the FreeFrame plugins, and the Polygon Mapper plugin have been installed, restart Isadora. The GUI will appear on your screen.

4.1.1 GUI Terminology

A. Scene Editor: This is where a scene is created. A scene is a container holding several actors connected to create a patch that formulates a certain situation.

B. Toolbox: This is where all the actors inside Isadora reside.

C. Toolbox Filter: Filters the actors into 11 different groups based on functionality, i.e. the video group or the communication group.

D. Pop-up Toolbox: Double-clicking inside the Scene Editor brings up the pop-up toolbox. Type a few letters of the actor you want to place in the scene and the list will populate with all actors fitting the spelling; then arrow down and press Enter for the actor to be placed in the Scene Editor.

E. Scene List: A timeline that holds all the different scenes for your show, through which you can move linearly or jump to specific scenes backwards or forwards.

PAGE 88

4.1.2 Other Terminology

Actor: Also called a module. Actors are visually represented by a dark grey box (blue when selected) containing inputs and/or outputs that allow something to appear, an effect to be applied, or a change to occur.

Stage: Stages are the output screens where the media is sent. Isadora allows for 6 stages in addition to a control screen. Changes to the stages can be made in the Preferences window (go to Preferences in the Isadora toolbar, then Stage). Each stage can be placed on any display that is connected to the main computer. For example, if you have a control computer connected to a projector, Stage 1 would be placed on 'Display 2'; the control screen, or 'main display', is not considered a stage. You can also control the size of each stage at the top of the Preferences window or with the drop-down menus beside each stage.

Patch: A patch is two or more actors connected together by a link.

PAGE 89

4.2 Importing Media

After you have opened Isadora, you need to import media before you begin mapping. It is not imperative to import all the media, or even the final media; you can simply import a placeholder so the mapping can be done. Go to File / Import Media. Shortcut: Command/Shift/I on a Mac, Control+M on a PC. The browser for importing media will appear.

4.2.1 Acceptable media file formats

Image files: .JPG, .PNG, .PDF, .PSD
Video files: .MOV*, .AVI, .QTZ
Audio files: .WAV, .MP3, .M4V**
3D files: .3DS
MIDI files: .MID

* MOV files should be QuickTime or Photo-JPEG. Be cautious when using H.264 movies, especially on a PC; there is a bug in Isadora that may cause it to crash.

** M4V files play audio, but they must be played with a 'Sound Movie Player' actor or a 'Movie Player' actor. This differs from other audio files, which must be played by a 'Sound Player' actor.

PAGE 90

4.2.2 Media Bin

Once the media has been imported it will be placed in the appropriate bin: Video Files, Audio Files, MIDI Files, Pictures, or 3D Models.

4.3 Creating the Patch

This next section explains the steps to build the patch used to map a polygon. There are 7 different actors used in the creation of this patch. Please make sure both the Polygon Mapper and the FreeFrame plugins have been installed before continuing (see sections 2 and 3).

4.3.1 Retrieving the actors

As discussed in section 4.1 on the GUI, there are three ways to retrieve an actor and place it in the Scene Editor. The first way is using the Toolbox itself: simply scroll through the list of all the actors, in alphabetical order, until you find the one you need. Click on the name and a green circle with a plus sign will appear. You do not need to drag and drop the actor; move the green plus sign into the Scene Editor and you will see the actor appear next to it. Click the spot inside the editor where you would like the actor to go.

PAGE 91

The second way is using the Toolbox Filter. Type the name of the actor you need in the box and the Toolbox will filter the list; click on the name in the list and drop it into the Scene Editor.

PAGE 92

The final and most efficient way to retrieve an actor is to use the Pop-up Toolbox. Double-click inside the Scene Editor, being sure to click on the spot where you want the actor to go. Type the first few letters of the actor you want and the filtered list will appear; arrow to the appropriate actor and press Enter. The actor will appear in that exact spot.

4.3.2 The Actors

Media Player: This can be a Picture Player or a Movie Player. The Picture Player simply "outputs a picture imported into the media [bin] as a video stream."6 The Movie Player "allows playback control of the movie imported into the media [bin]. The visibility, speed, loop points and the position of the movie can be modified."7

Polygon Mapper: Dynamic: The Polygon Mapper allows a nearly unlimited number of vertices making up a polygon to be dynamically added, mapped, and filled with content.

PAGE 93

Projector: The Projector "positions, scales and renders a video stream to a specified stage."8

Stage Mouse Watcher: The Stage Mouse Watcher "sends information about the mouse when it is within the stage specified by the 'stage' input. It reports mouse clicks and releases, the mouse position and whether or not the mouse is within the stage."

Keyboard Watcher: The Keyboard Watcher "looks for keys on the computer keyboard to be pressed, released, or both. The 'key range' input property can be set to limit the range of characters that this watcher will see. When this watcher sees a character within the specified range, it will send the character that was typed out of the 'key' output."9

Toggle: The Toggle actor "toggles between an on and off state each time a trigger is received from the trigger input."10

PanSpinZoom: PanSpinZoom is an optional FreeFrame plugin that allows for the manipulation of the media independent of the map. As indicated by its name, this actor can pan, spin, or zoom the picture or video file.

4.3.3 The Stage Setup

If you are simply testing the Polygon Mapper without a projector connected, the stage window will float on the main display.

PAGE 94

Show stages by going to Output > Show Stages. Shortcut: Command+G on a Mac, Control+G on a PC.

If you would like this display to be bigger, go to Isadora > Preferences > Stage > Stage Size and change the dimensions. Be sure Stage 1 is placed on 'Display 2' even if you do not have a second display connected. Be sure to SELECT 'Floating Stage Windows' so the stage window is visible when editing the patch. Be sure to DESELECT 'Hide Cursor When Full Screen' under General Stage Options; it is necessary to see the cursor at all times when using the Polygon Mapper.

PAGE 95

4.3.4 Step by Step Patch Build

Step 1: Add a Picture Player to the Scene Editor. For the purposes of initial mapping, use a Picture Player to avoid having moving images to map.

Step 2: Load the image into the Picture Player. Control+M brings up the media bin. Insert the number 1 in the 'picture' input of the Picture Player and press Enter. Note: use the number corresponding to the desired picture.

PAGE 96

Step 3: Add a Polygon Mapper actor to the Scene Editor. Connect the 'video out' output of the Picture Player to the 'video in' input on the Polygon Mapper by clicking once on the output dot on the Picture Player and then clicking a second time on the input dot.

Step 4: Add a Projector to the Scene Editor and connect the 'video out' of the Polygon Mapper to the 'video in' on the Projector.

PAGE 97

Step 5: Add a Stage Mouse Watcher to the Scene Editor. Connections: the Stage Mouse Watcher's 'horz. pos' to the Polygon Mapper's 'mouseX', and its 'vert. pos' to 'mouseY'.

Step 6: Add a Keyboard Watcher to the Scene Editor. Click on the black box to the left of 'key range', type in 0-255, and press Enter; this allows all the keys to be watched. Connections: the Keyboard Watcher's 'key' output to the Polygon Mapper's 'keyCode' input.

PAGE 98

Step 7: Add a Toggle to the Scene Editor. Connections: the Stage Mouse Watcher's 'right mouse down' to the Toggle's 'trigger', and the Toggle's 'trigger out' to the Polygon Mapper's 'edit' input.

4.4 Creating the Polygon

Once the patch has been completed it is time to start creating the polygon map.

Step 8 (continued from section 4.3.4): Change the 'dotsize' to 5 if you are mapping something large, 1 if something small, or somewhere in between depending on the size of the object.

Step 9: Add vertices in the 'verticesCount' input. Click inside the black box to the left of 'verticesCount', enter a number between 3 and 999, and press Enter. For this example we will begin with 6 vertices. Each vertex is represented by an X and Y input, labeled vertex 0 X, vertex 0 Y; vertex number 1 is referred to as zero.

PAGE 99

Step 10: Show stages (Command+G / Control+G). The full image will appear with the edit dots evenly spaced out around the perimeter of the stage. If you do not see the green and red dots, see Step 11; otherwise, skip to Step 12.

Step 11: Turn the edit mode on. Move the mouse over the stage window and right-click. You should see the edit dots toggle on and off, and you should see the trigger from the Stage Mouse Watcher fire the Toggle, which triggers the 'edit' input of the Polygon Mapper. Practice right-clicking to see the edit mode switch on and off.

Step 12: Move the mouse over the stage. As the mouse moves, the first vertex X and Y, represented by the green dot, follows the mouse. The vertex dot will only follow the mouse if three things are true: one, the edit mode is on; two, the mouse is in the stage window; three, the mouse first touches the top left corner where the dot is. In order for the dot to move with the mouse, the mouse must first touch the location of the dot being edited (the red dot). The dot then 'sticks' to the mouse until it is released by changing edit dots. Move the first edit dot into the desired position.

Step 13: Press the left arrow key to move to the next edit dot. The second vertex X and Y will now turn green. The left and right arrow keys rotate through the vertices; all the dots are red unless they are being edited.

PAGE 100

Step 14: Repeat steps 12 and 13 until all the dots are mapped to the corners of the cube (for practice, just create a shape). Note: as you map vertices you will notice the values of the Stage Mouse Watcher being output to the 'mouseX' and 'mouseY' inputs of the Polygon Mapper. Those values are then transferred to the appropriate values for the vertex being edited.

Step 15: Right-click on the stage window to turn off edit mode. All the dots will disappear and your map is complete.

PAGE 101

Step 16: Anti-aliasing should only be added once the map is complete. There are eight levels of anti-aliasing, 0-7; choose the proper level for your map. Note: level 4 should suffice for most maps; going any higher may create lag with your media.

4.5 Aligning Media

For optimum control over the map, add the FreeFrame plugin actor PanSpinZoom between the media player and the Polygon Mapper.

PAGE 102

[Figures: Spin, Zoom, Pan]

4.6 Adding Effects

Several effects can be added to the media inside the polygon. These effect actors must be placed between PanSpinZoom and the Polygon Mapper. More than one effect can be added at the same time by linking multiple effects together and then connecting them as a whole between PanSpinZoom and the Polygon Mapper. Most of the effects used in Isadora are FreeFrame plugins.

PAGE 103

Examples of other effects: Solarize, Glow, Kaleidoscope, Burn, TV, Pixel Dots.

Footnotes

1 Isadora, Troika Tronix, software description.
2 Isadora system requirements, Troika Tronix website.
3 USB Key Policy, http://www.troikatronix.com/izzy-download.html.
4 Isadora pre-release notes, Troika Tronix website.
5 FreeFrame, website.
6 Isadora, Picture Player Actor Help Description.
7 Isadora, Movie Player Actor Help Description.
8 Isadora, Projector Actor Help Description.
9 Isadora, Keyboard Watcher Actor Help Description.
10 Isadora, Toggle Actor Help Description.

PAGE 104

! "#$ //As of 10 23 // =========================================================================== // Isadora Demo Plugin 2003 Mark F. Coniglio. All rights reserved. // =========================================================================== // // IMPORTANT: This source code ("the software") is supplied to you in // consideration of your agreement to the following terms. If you do not // agree to the terms, do not inst all, use, modify or redistribute the // software. // // Mark Coniglio (dba TroikaTronix) grants you a personal, non exclusive // license to use, reproduce, modify this software with and to redistribute it, // with or without modifications, in source and/or binary form. Except as // expressly stated in this license, no other rights are granted, express // or implied, to you by TroikaTronix. // // This software is provided on an "AS IS" basis. TroikaTronix makes no // warranties, express or implied, including without limitation the implied // warranties of non infringement, merchantability, and fitness for a // particular purpose, regarding this software or its use and operation // alone or in combination with your products. // // In no event shall TroikaTronix be liable for any special, indirect, incidental, // or consequential damages arising in any way out of the use, reproduction, // modification and/or distribution of this software. // // =========================================================================== // // CUSTOMIZING THIS SOURCE CODE // To customize this file, search for the text ###. All of the places where // you will need to customize the file are marked with this pattern o f // characters. // // ABOUT IMAGE BUFFER MAPS: // // The ImageBufferMap structure, and its accompanying functions, // exists as a convenience to those writing video processing plugins. // // Basically, an image buffer contains an arbitrary number of inpu t and // output buffers (in the form of ImageBuffers). The ImageBufferMap code // will automatically create intermediary buffers if needed, so that the // size and depth of the source image buffers sent to your callback are // the same for all buffers. // // Typically, the ImageBufferMap is created in your CreateActor function, // and dispose in the DiposeActor function. // --------------------------------------------------------------------------------// INCLUDES // -------------------------------------------------------------------------------#include "IsadoraTypes.h" #include "IsadoraCallbacks.h" #include "ImageBufferUtil.h" #include "PluginDrawUtil.h" // STANDARD INCLUDES #include #include // --------------------------------------------------------------------------------// MacOS Specific // --------------------------------------------------------------------------------#if TARGET_OS_MAC #define EXPORT_ #endif APPENDIX B Polygon Mapper Source Code


! "#$ // -------------------------------------------------------------------------------// Win32 Specific // --------------------------------------------------------------------------------#if TARGET_OS_WIN32 #include //added to windows line, will not complie in mac if in standard// #define EXPORT_ __declspec(dllexport) #ifdef __cplusplus extern "C" { #endif BOOL WINAPI DllMain ( HINSTANCE hInst, DWORD wDataSeg, LPVOID lpvReserved ); #ifdef __cplusplus } #endif BOOL WINAPI DllMain (HINSTANCE /* hInst */ ,DWORD wDataSeg, LPVOID /* lpvReserved */ ) { switch (wDataSeg) { case DLL_PROCESS_ATTACH: return 1 ; break ; case DLL_PROCESS_DETACH: break ; default : return 1 ; break ; } return 0 ; } #endif // --------------------------------------------------------------------------------// Exported Function Definitions // --------------------------------------------------------------------------------#ifdef __cplusplus // non command part if computer understands c++, the SDK will use C// extern "C" { #endif EXPORT_ void GetActorInfo( void inParam, ActorInfo outActorParams); // takes out part of c++ if computer // is not c++// #ifdef __cplusplus } #endif // must be included from here up// // --------------------------------------------------------------------------------// FORWARD DECLARTIONS //PROGRAM STARTS// TYPES and NAMES // staic is one way to achieve communication between multiple actors, making it available to all// // -------------------------------------------------------------------------------static void // void means there is no output// ReceiveMessage( IsadoraParameters ip, MessageMask inMessageMask, PortIndex inPortIndex1, const MsgData inDat a, UInt32 inLen, long inRefCon); // --------------------------------------------------------------------------------


! "#$ // GLOBAL VARIABLES // --------------------------------------------------------------------------------// ### Declare global variables, common to all instantiations of this plugin here // Example: static int gMyGlobalVariable = 5; // --------------------------------------------------------------------------------// PluginInfo struct // -------------------------------------------------------------------------------// ### This structure neeeds to contain all variables used by your plugin. Memory for // this struct is allocated during the CreateActor function, and disposed during // the DisposeActor function, and is private to each copy of the plugin. // // If your plugin needs global data, declare them as static variables within this // file. Any static variable will be global to all instantiations of the plugin. typedef struct { ActorInfo mActorInfoPtr ; // our ActorI nfo Pointer set during create actor fn MessageReceiverRef mMessageReceiver; // pointer to our message receiver reference Boolean mNeedsDraw; // set true when the video output needs to be drawn ImageBufferMap mImageBufferMap; // used by most video plugins Boolean mBypass; Boolean mEditMode; //turns on/off edit dots// int mEditSize; //size of dot int mVerticesCount; // vertices count int mVerticesIndex; // current vertices index int mVertexCoordinateMin; int mVertexCoordinateMax; float mCurrentWidth; float mCurrentHeight; int mCurrentVertexIndex; int mLastKeyCode; float mCoordinates; // polygon vertices is added for float to define an empty array of unknown size sy ntax called pointer// int AlphaMask; // point in polygon alpha mask saved in here int AlphaMaskWidth; int AlphaMaskHeight; Boolean RecalulateMask; // trigger the need for a recalulation of the mask Boolean jus tStartedEditingX; Boolean justStartedEditingY; int anti_aliasing; //optimized anti aliasing Value valuesInit[ 128 ]; Value valuesMin[ 128 ]; Value valuesMax[ 128 ]; } PluginInfo; static void addInputProperty( IsadoraParameters ip, ActorInfo inActorInfo, PluginInfo info, const char nameTemplate, const int index ); static void removeInputProperty( IsadoraParameters ip,


! "#$ ActorInfo inActorInfo, const PluginInfo info ); // A handy macro for casting the mActorDataPtr to PluginInfo* #if __cplusplus #define GetPluginInfo_(actorDataPtr) static_cast((actorDataPtr) >mActorDataPtr); #else #define GetPluginInfo_(actorDataPtr) (PluginInfo*)((actorDataPtr) >mActorDataPtr ); #endif // --------------------------------------------------------------------------------// Constants // --------------------------------------------------------------------------------// Defines various constants used throughout the plugin. // ### GROUP ID // Define the group under which this plugin will be displayed in the Isadora interface. // These are defined under "Actor Types" in IsadoraTypes.h static const OSType kActorClass = kGroupVideo ; // ### PLUGIN IN // Define the plugin's unique four character identifier. Contact TroikaTronix to // obtain a unique four character code if you want to ensure that someone else // has not developed a plugin with the same code. Note that TroikaTronix reserves // all plugin codes that begin with an unde rline, an at sign, and a pound sign // (e.g., '_', '@', and '#'.) static const OSType kActorID = FOUR_CHAR_CODE ( 'PR12' ); // ### ACTOR NAME // The name of the actor. This is the name that will be shown in the User Interface. static const char kActorName = "Polygon Mapper: Dynamic" ; // ### PROPERTY DEFINITION STRING // The property string. This string determines the inputs and outputs for your plugin. // See the IsadoraCallbacks.h under the heading "PROPERTY DEFINITION STRING" for the // meaning ofthese codes. (The IsadoraCallbacks.h header can be seen by opening up // the IzzySDK Framework while in the Files view.) // // IMPORTANT: You cannot use spaces in the property name. Instead, use underscores (_) // where you want to have a space. // // Note that each line ends with a carriage return ( \ r), and that only the last line of // the bunch ends with a semicolon. This means that what you see below is one long // null terminated c string, with the individual lines separated by carriage retur ns. static const char sPropertyDefinitionString = // INPUT PROPERTY DEFINITIONS // TYPE PROPERTY NAME ID DATATYPE DISPLAY FMT MIN MAX INIT VALUE "INPROP video_in vin data video 0 \ r" "INPROP mouseX mosX float number 0 \ r" "INPROP mouseY mosY float number 0 \ r" "INPROP keyCode keyC int number 0 \ r" "INPROP antiAliasing anti int number 0 7 0 \ r" "INPROP fValue fVal float number 0 10240 1 0 1 \ r" "INPROP edit edit bool onoff 0 1 1 \ r" "INPROP dotsize dot int number 0 5 0 \ r" "INPROP verticesCount vcnt int number,mutable1 0 999 0 1 \ r" // OUTPUT PROPERTY DEFINITIONS // TYPE PROPERTY NAME ID DATATYPE DISPLAY FMT MIN MAX INIT VALUE "OUTPROP video_out vout data video 0 \ r" ; // ### Property Index Constants


! "#$ // Properties are referenced by a one based index. The first input property will // be 1, the second 2, etc. Similarly, the first output property starts at 1. // You whould have one constant for each input and output property defined in the // property definition string. enum //just labeling the input/output// { kInputVideoIn = 1 kInputMouseX kInputMouseY kInputKeyCode kAntiAliasing kInputFValue kInputEditMode kInputEditSize kInputVerticesCount kOutputVideo = 1 }; // --------------------// Help String // --------------------// ### Help Strings // // The first help string is for the actor in general. This followed by help strings // for all of the inputs, and then by the help strings for all of the outputs. These // should be given in the order that they are defined in the Property Definition // String above. // // In all, the total number of help strings should be (num inputs + num outputs + 1) // // Note that each string is followed by a comma -it is a common m istake to forget the // comma which results in the two strings being concatenated into one. const char sHelpStrings [] = { "Polygon Mapper allows for vertices in a polygon to be mapped and filled with content." "This version of the actor allows for 6 points at a resolution of 640x480. For additional points and different resolutions, new versions of the actor will be programmed." "The video source that will fill the map. It can be a video, picture, shape, etc ." "Mouse X position" "Mouse Y position" "Anti Aliasing Values" "fValue" "Edit: turns the edit dots on and off" "Edit Size: changes the value of the edit dots from 0 pixels 5 pixels" "Vertices count" "The mapped video output." }; // --------------------------------------------------------------------------------// CreateActor // standard create and destroy an actor// // --------------------------------------------------------------------------------// Called once, prior to the first activation of an actor in its Scene. The // corresponding DisposeActor actor function will not be called until the file // owning this actor is closed, or the actor is destroyed as a result of being // cut or deleted. static void CreateActor( IsadoraParameters ip, ActorInf o ioActorInfo) // pointer to this actor's ActorInfo struct unique t o each instance of an actor { // creat the PluginInfo struct initializing it to all zeroes PluginInfo info = ( PluginInfo *) IzzyMallocClear_ (ip, sizeof ( PluginInfo ));


! "#$ info > mCoordinates = ( float *) IzzyMallocClear_ (ip, sizeof ( float ) 999 2 ); PluginAssert_ (ip, info != nil ); ioActorInfo > mActorDataPtr = info; info > mActorInfoPtr = ioActorInfo; // ### allocation and initialization of private member variables // set number of input and output buffers in our buffer map // and then initialize it info > mImageBufferMap mInputBufferCount = 1 ; info > mImageBufferMap mOutputBufferCount = 1 ; info > mVertexCoordinateMin = 0 ; info > mVertexCoordinateMax = 10240 ; info > AlphaMaskWidth = 1 ; info > AlphaMaskHeight = 1 ; info > AlphaMask = NULL ; info > RecalulateMask = true ; CreateImageBufferMap (ip, &info > mImageBufferMap ); } // --------------------------------------------------------------------------------// DisposeActor // --------------------------------------------------------------------------------// Called when the file owning this actor is closed, or when the actor is destroyed // as a result of its being cut or deleted. // static void DisposeActor( IsadoraParameters ip, ActorInfo ioActorInfo) // pointer to this actor's ActorInfo struct unique to each instance of an actor { PluginInfo info = GetPluginInfo_ (ioActorInfo); PluginAssert_ (ip, info != nil ); // ### destruction of private member variables // destroy our image buffer map DisposeImageBufferMap (ip, &info > mImageBufferMap ); // destroy the PluginInfo struct allocated with IzzyMa llocClear_ the CreateActor function PluginAssert_ (ip, ioActorInfo > mActorDataPtr != nil ); IzzyFree_ (ip, info > mCoordinates ); IzzyFree_ (ip, ioActorInfo > mActorDataPtr ); } // -------------------------------------------------------------------------------// • CreatePropertyID [INTERRUPT SAFE] // --------------------------------------------------------------------------------inline OSType CreatePropertyID( IsadoraParameters ip, const char inRateBase, SInt32 inIndex) { const SInt32 kOneCharMax = 26 ; const SInt32 kTwoCharMax = kOneCharMax kOneCharMax; PluginAssert_ (ip, inRateBase[ 0 ] != 0 && inRateBase[ 1 ] != 0 ); PluginAssert_ (ip, inIndex >= 0 && inIndex < kTwoCharMax 2 ); OSType result = ((( UInt32 ) inRateBase[ 0 ]) << 24 ) | ((( UInt32 ) inRateBase[ 1 ]) << 16 );


! ""# SInt32 indexLS; SInt32 indexMS; SInt32 indexOffset; // in index is between 00 and 99 if (inIndex >= 0 && inIndex < 100 ) { indexMS = inIndex / 10 ; indexLS = inIndex % 10 ; result |= ( ((( UInt32 ) (indexMS + '0' )) << 8 ) | ((( UInt32 ) (indexLS + '0' )) << 0 ) ); // if between 100 and 776 } else if (inIndex >= 100 && inIndex < 100 + kTwoCharMax) { indexOffset = inIndex 100 ; PluginAssert_ (ip, indexOffset >= 0 && indexOffset < kTwoCharMax) ; indexMS = indexOffset / kOneCharMax; indexLS = indexOffset % kOneCharMax; result |= ( ((( UInt32 ) (indexMS + 'A' )) << 8 ) | ((( UInt32 ) (indexLS + 'A' )) << 0 ) ); // if between 776 and 1452 } else if (inIndex >= 100 + kTwoCharMax && inIndex < 100 + kTwoCharMax 2 ) { indexOffset = inIndex ( 100 + kTwoCharMax); PluginAssert_ (ip, indexOffset >= 0 && indexOffset < kTwoCharMax); indexMS = indexOffset / kOneCharMax; indexLS = indexOffset % kOneCharMax; result |= ( ((( UInt32 ) (indexM S + 'a' )) << 8 ) | ((( UInt32 ) (indexLS + 'a' )) << 0 ) ); } else { PluginAssert_ (ip, false ); } return result; } // --------------------------------------------------------------------------------// ActivateActor // -------------------------------------------------------------------------------// Called when the scene that owns this actor is activated or deactivated. The // inActivate flag will be true when the scene is activated, false when deactivated. // static void ActivateActor( IsadoraParameters ip, ActorInfo inActorInfo, // pointer to this actor's ActorInfo struct unique to each instance of an actor Boolean inActivate) // true when actor is becoming active, else false { PluginInfo info = GetPluginInfo_ (inActorInfo); // -----------------------// ACTIVATE // -----------------------if (inActivate) { // Isadora passes various messages to plugins that request them. // These include Mouse Moved messages, Key Down/Key Up messages, // Video Frame Clock messages, etc. The complete list can be found // in the enumeration in MessageReceiverCommon.h // You ask Isadora¨ for these messages by calling CreateMessageReceiver_ // with a pointer to your function, and the me ssage types you would // like to receive. (These are bitmapped flags, so you can combine as


! """ // many as you like: kWantKeyDown | kWantKeyDown for instance.) // Here we request that our ReceiveMessage function is called // whenever the Isadora New Video Frame message is sent, // which happens periodically, 30 times per second. We set the ref // con to our ActorInfo ptr so that we can access that information // from ReceiveMessage callback. MessageReceiveFunction* msgRcvFunc = ReceiveMessage; // if the "bypass" flag is off, then we want to receive messages // if (info >mBypass == false) { // we should not already have a message receiver PluginAssert_ (ip, info > mMessageReceiver == nil ); // create a message rece iver that will be notified of // video frame ticks info > mMessageReceiver = CreateMessageReceiver_ ( ip, msgRcvFunc, 0 kWantVideoFrameTick ( long ) inActorInfo); // } // set the needs draw flag so that we will be drawn as soon // as possible info > mNeedsDraw = true ; // -----------------------// DEACTIVATE // -----------------------} else { // dispose our message receiver when we are deactivated. if (info > mMessageReceiver != nil ) { DisposeMessageReceiver_ (ip, info > mMessageReceiver ); info > mMessageReceiver = nil ; info > mNeedsDraw |= true ; } // ### dispose any data that you don't need when // you are not active. DisposeOwnedImageBuffers (ip, &info > mImageBufferMap ); ClearSourceBuffers (ip, &info > mImageBufferMap ); } } // --------------------------------------------------------------------------------// GetParameterString // -------------------------------------------------------------------------------// Returns the property definition string. Called when an instance of the actor // needs to be instantiated. static const char GetParameterString( IsadoraParameters /* ip */ ActorInfo /* inActorInfo */ ) { retu rn sPropertyDefinitionString ; } // --------------------------------------------------------------------------------// GetHelpString // --------------------------------------------------------------------------------// Returns the help string for a particular property. If you have a fixed number of // input and output properties, it is best to use the PropertyTypeAndIndexToHelpIndex_


! ""# // function to determine the correct help string to return. static void GetHelpString( IsadoraParameters ip, ActorInfo inActorInfo, PropertyType inPropertyType, // kPropertyTypeInvalid w hen requesting help for the acto r P ropertyIndex inPropertyIndex1, // the one based index of the property (when inPropertyType is not kPropertyTypeInvalid) char outParamaterString, // receives the help string UInt32 inMaxCharacters) // size of the outParamaterString buffer { const char helpstr = nil ; // The PropertyTypeAndIndexToHelpIndex_ converts the inPropertyType and // inPropertyIndex1 parameters to determine the zero based index into // your list of help strings. if ((inPropertyType == kInputProperty ) && (inPropertyIndex1 > kInputVerticesCount )) { // copy it to the output string strncpy (outParamaterString, "Vertex coortinates" inMaxCharacters); } else { UInt32 index1 = PropertyTypeAndIndexToHelpIndex_ (ip, inActorInfo, inPropertyType, inPropertyIndex1); // get the help string helpstr = sHelpStrings [index1]; // copy it to the output string strncpy (outParamaterString, helpstr, inMaxCharacters); } } void setInitialCoordinates( float points, int numofpoints) { int m=(numofpoints)%( 4 ); int n=(numofpoints m)/ 4 ; int i; int l; int c= 0 ; //upper if (m> 0 ) l=n+ 1 ; else l=n; for (i= 0 ;i 1 ) l=n+ 1 ; else l=n; for (i= 0 ;i 2 ) l=n+ 1 ; else l=n; for (i= 0 ;i

! ""# //left if (m> 3 ) l=n+ 1 ; else l=n; for (i= 0 ;icurrent_point;i -) { points[(i+numofnewpoints numofpoints)* 2 + 1 ]=points[i* 2 + 1 ]; points[(i+numofnewpoints numofpoints)* 2 + 0 ]=points[i* 2 + 0 ]; } //new points for (i= 0 ;i<(numofnewpoints numofpoints);i++) { points[(current_point+ 1 +i)* 2 + 0 ]=points[current_point* 2 + 0 ]+dx*(i+ 1.0 )/(numofnewpoints numofpoints+ 1.0 ); points[(current_point+ 1 +i )* 2 + 1 ]=points[current_point* 2 + 1 ]+dy*(i+ 1.0 )/(numofnewpoints numofpoints+ 1.0 ); } } // --------------------------------------------------------------------------------// HandlePropertyChangeValue [INTERRUPT SAFE] // -------------------------------------------------------------------------------// ### This function is called whenever one of the input values of an actor changes. // The one based property index of the input is given by inPropertyIndex1. // The new value is given by inNewValue, the pre vious value by inOldValue. // static void HandlePropertyChangeValue( IsadoraParameters ip, ActorInfo inActorInfo, PropertyIndex inPropertyIndex1, // the one based index of the property than // changed values ValuePtr /* inOldValue */ // the property's old value ValuePtr inNewValue, // the property's new value Boolean inInitializing) // true if the value is being set when an // actor is first initalized { PluginInfo info = GetPluginInfo_ (inActorInfo); // ### When you add/change/remove properties, you will need to add cases // to this switch statement, to process the messages for your // input properties // The value comes to you encapsulated in a Value structure. See // ValueCommon.h for details about the contents of this structure.


! ""# switch (inPropertyIndex1) { case kInputVideoIn : { // if bypass is off, then we store the incoming video frame reference // into our image buffer -it will be processed when our ReceiveMessage // function next receives a kWantVideoFrameTick message //if (info >mBypass == false) { SetImageBufferValue (ip, &info > mImageBufferMap 0 GetDataValueOfType (inNe wValue, k ImageBufferDataType ImageBufferPtr )); // set mNeedsDraw flag to ensure that new video image is drawn info > mNeedsDraw |= true ; // if bypass is on, we simply send the incoming video frame reference // on to the output -bypassing all processing entirely. } break ; case kInputEditMode : { info > justStartedEditingX = false ; info > justStartedEditingY = false ; info > mEditMode = inNewValue > u ivalue ; info > mNeedsDraw = true ;} break ; case kInputEditSize : {info > mEditSize = inNewValue > u ivalue ; info > mNeedsDraw = true ;} break ; case kInputKeyCode : if (inNewValue > u ivalue == 0 ) break ; if (inNewValue > u ivalue == 28 ) { info > justStartedEditingX = false ; info > justStartedEditingY = false ; info > mCurrentVertexIndex -; if (info > mCurrentVertexIndex < 0 ) info > mCurrentVertexIndex = (info > mVerticesCount / 2 ) 1 ; } else if (inNewValue > u ivalue == 29 ) { info > justStartedEditingX = false ; info > justStartedEditingY = false ; info > mCurrentVertexIndex ++; if (info > mCurrentVertexIndex >= (info > mVerticesCount / 2 )) info > mCurrentVertexIndex = 0 ; } if (info > mCurrentVertexIndex < 0 ) info > mCurrentVertexIndex = 0 ; Value v; v. type = kInteger ; v. u ivalue = 0 ; SetInputPropertyValue_ (ip, inActorInfo, inPropertyIndex1, &v); info > mNeedsDraw |= true ; break ; case kInputMouseX : if (info > mEditMode ) { float value = inNewValue > u fvalue


! ""# Value v; v. type = kFloat ; v. u fvalue = value; if ((info > justStartedEditingX == true )&&(info > justStartedEditingY == true )){ info > mCoordinates [(info > mCurrentVertexIndex 2 ) + 0 ] = value; int index = ( kInputVerticesCount + (info > mCurrentVertexIndex 2 ) + 0 + 1 ); SetInputPropertyValue_ (ip, inActorInfo, index, &v); } else if ( abs (value info > mCoordinates [(info > mCurrentVertexIndex 2 ) + 0 ])< 5 ) info > justStartedEditingX = true ; info > mNeedsDraw |= true ; info > RecalulateMask = true ; } break ; case kInputMouseY : if (info > mEditMode ) { float value = inNewValue > u fvalue Value v; v. type = kFloat ; v. u fvalue = value; if ((info > justStartedEditingX == true )&&(info > justStartedEditingY == true )){ info > mCoordinates [(info > mCurrentVertexIndex 2 ) + 1 ] = value; int index = ( kInputVerticesCount + (info > mCurrentVertexIndex 2 ) + 1 + 1 ); SetInputPropertyValue_ (ip, inActorInfo, index, &v); } else if ( abs (value info > mCoordinates [(info > mCurrentVertexIndex 2 ) + 1 ])< 5 ) info > justStartedEditingY = true ; info > mNeedsDraw |= true ; info > RecalulateMask = true ; } break ; case kAntiAliasing : {info > anti_aliasing = inNewValue > u ivalue ; info > mNeedsDraw = true ;info > RecalulateMask = true ;} break ; case kInputBypass: // store member variable for on/off info > mBypass = (inNewValue > u ivalue != 0 ); // if "bypass" is going from on to off, we need to // reallocate our message receiver so that we will // start receiving video frame tick messages again if (info > mBypass == false ) { if (info > mMessageReceiver == nil ) { info > mMessageReceiver = CreateActorMessageReceiver_ ( ip, inActorInfo, ReceiveMessage, 0 kWantVideoFrameTick


! ""# ( long ) inActorInfo); } // if "bypass" is g oing from off to on, then we want to // stop processing video. We dispose our message receiver // here to save processing power -when bypass is "on" the // incoming video is sent directly to the output -see the // case kInputVideoIn abo ve. } else { if (info > mMessageReceiver != nil ) { DisposeMessageReceiver_ (ip, info > mMessageReceiver ); info > mMessageReceiver = nil ; info > mNeedsDraw |= true ; } } break ; case kInputVerticesCount : int newCount = inNewValue > u ivalue ; newCount *= 2 ; if (newCount == info > mVerticesCount ) break ; if (info > mVerticesCount >= 3 && newCount>info >mVericesCount) { setNewCoordinates(info > mCoordinates ,info > mCurrentVertexIndex ,info >mVericesCount/ 2 ,newCount/ 2 ); } else if (info > mVerticesCount < 3 && newCount> 0 ) { setInitialCoordinates (info > mCoordinates ,newCount/ 2 ); } if (inInitializing) { info > mVerticesCount = newCount; break ; } if (newCount > info > mVerticesCount ) { while (info > mVerticesCount < newCount) { Value v; v. type = kFloat ; addInputProperty ( ip, inActorInfo, info, "vertex %d X" info > mVerticesCount ); v. u fvalue = info > mCoordinates [info > mVerticesCount ]; SetInputPropertyValue_ (ip, inActorInfo, ( kInputVerticesCount + (info > mVerticesCount ) + 0 + 1 ), &v); info > mVerticesCount ++; addInputProperty ( ip, i nActorInfo, info, "vertex %d Y" info > mVerticesCount ); v. u fvalue = info > mCoordinates [info > mVerticesCount ]; SetInputPropertyValue_ (ip, inActorInfo, ( kInputVerticesCount + (info > mVerticesCount ) + 0 + 1 ), &v);


! ""# info > mVerticesCount ++; } } else { while (info > mVerticesCount > newCount) { removeInputProperty ( ip, inActorInfo, info ); info > mVerticesCount -; } } break ; } if (inPropertyIndex1 > kInputVerticesCount ) { int index = inPropertyIndex1 ( kInputVerticesCount + 1 ); info > mCoordinates [index] = ( float ) (inNewValue > u fvalue ); info > mNeedsDraw = true ;info > RecalulateMask = true ; } } static void removeInputProperty( IsadoraParameters ip, ActorInfo inActorInfo, const PluginInfo info ) { IzzyError err = RemovePropertyProc_ ( ip, inActorInfo, kInputProperty kInputVerticesCount + 0 + info > mVerticesCount ); } static void addInputProperty( IsadoraParameters ip, A c torInfo inActorInfo, PluginInfo info, const char nameTemplate, const int index1 ) { //Value valueInit; info > valuesInit [info > mVerticesCount ]. type = kFloat ; info > valuesInit [info > mVerticesCount ]. u fvalue = 0 ; PropertyDispFormat availFmts; PropertyDispFormat curFmt; if ( GetPropertyDisplayFormats_ (ip, inActorInfo, kInputProperty kInputFValue &availFmts, & curFmt) != noErr ) { } int index = kInputVerticesCount + 1 + info > mVerticesCount ; OSType rateType = CreatePropertyID (ip, "it" index); char propertyName[ 128 ]; sprintf (propertyName, nameTemplate, info > mVerticesCount / 2 );


! ""# //Value valueMin; //Value valueMax; info > valuesMin [info > mVerticesCount ]. type = kFloat ; info > valuesMax [info > mVerticesCount ]. type = kFloat ; info > valuesMin [info > mVerticesCount ]. u fvalue = info > mVertexCoordinateMin ; info > valuesMax [info > mVerticesCount ]. u fvalue = info > mVertexCoordinateMax ; IzzyError err = AddProperty_ ( ip, inActorInfo, kInputProperty rateType, // the input type FOUR_CHAR_CODE ( 'fVal' ), // the input to which we will conform propertyName, availFmts, curFmt, 1 &(info > valuesMin [info > mVerticesCount ]), &(info > valuesMax [info > mVerticesCount ]), &(info > valuesInit [info > mVerticesCount ]) ); if (err != noErr ) { } PluginAssert_ (ip, err == noErr ); } // --------------------------------------------------------------------------------// GetActorDefinedArea // --------------------------------------------------------------------------------// If the mGetActorDefinedAreaProc in the ActorInfo struct points to this function, // it indicates to Isadora that the object would like to draw either an icon or else // an graphic representation of its function. // // ### This function uses the 'PICT' 0 resource stored with the plugin to draw a n icon. // You should replace this picture (located in the Plugin Resources.rsrc file) with // the icon for your actor. // static ActorPictInfo gPictInfo = { false nil nil 0 0 }; static Boolean GetActorDefinedArea( IsadoraParameters ip, ActorInfo inActorInfo, SInt16 outTopAreaWidth, // returns the width to reserve for the top Actor Defined Area SInt16 outTopAreaMinHeight, // returns the minimum height of the top area SInt16 outBotAreaHeight, // returns the wd to reserve for bottom Actor Defined Area SInt16 outBotAr eaMinWidth) // returns the minimum width of the bottom area { if (! gPictInfo mInitialized ) { PrepareActorDefinedAreaPict_ (ip, inActorInfo, 0 & gPictInfo ); } // place picture in top area *outTopAreaWidth = gPictInfo mWidth ; *outTopAreaMinHeight = gPictInfo mHeight ; // don't draw anything in bottom area *outBotAreaHeight = 0 ; *outBotAreaMinWidth = 0 ; return true ; }


! ""# // --------------------------------------------------------------------------------// DrawActorDefinedArea // --------------------------------------------------------------------------------// If GetActorDefinedArea is defined, then this function will be called whenever // your ActorDefinedArea needs to be drawn. // // Beacuse we are using the PICT 0 resource stored with this plugin, we can use // the DrawActorDefinedAreaPict_ supplied by the Isadora callbacks. // // DrawActorDefinedAreaPict_ is Alpha Channel aware, so you can have nice // shading if you like. static void DrawActorDefinedArea( IsadoraParameters ip, ActorInfo inActorInfo, void /* inDrawingContext */ // unused at present ActorDefinedAreaPart inActorDefinedAreaPart, // the part of the actor that needs to be drawn ActorAreaDrawFlagsT /* inAreaDrawFlags */ // actor draw flags Rect inADAArea, // rect enclosing the entire Actor Defined Area Rect /* inUpdateArea */ / / subset of inADAArea that needs updat ing Boolean inSelected) // TRUE if actor is currently selected { if (inActorDefinedAreaPart == kActorDefinedAreaTop && gPictInfo mInitialized ) { DrawActorDefinedAreaPict_ (ip, inActorInfo, inSelected, inADAArea, & gPictInfo ); } } // --------------------------------------------------------------------------------// GetActorInfo // --------------------------------------------------------------------------------// This is function is called by to get the actor's class and ID, and to get // pointers to the all of the plugin functions declared locally. // // All members of the ActorInfo struct pointed to by outActorParams have been // set to 0 on entry. You only need set functions defined by your plugin // EXPORT_ void GetActorInfo( void /* inParam */ ActorInfo outActorParams) { // REQUIRED information outActorParams > mActorName = kActorName ; outActorParams > mClass = kActorClass ; outActorParams > mID = kActorID ; outActorParams > mCompatibleWithVersion = kCurrentIsadoraCallbackVersion ; // REQUIRED functions outActorParams > mGetActorParameterStringProc = GetParameterString; outActorParams > mGetActorHelpStringProc = GetHelpString; outActorParams > mCreateActorProc = CreateActor; outActorParams > mDisposeActorProc = DisposeActor; outActorParams > mActivateActorProc = ActivateActor; outActorParams > mHandlePropertyChangeValueProc = HandlePropertyChangeValue; // OPTIONAL FUNCTIONS outActorParams > mHandlePropertyChangeTypeProc = NULL ; outAc torParams > mHandlePropertyConnectProc = NULL ; outActorParams > mPropertyValueToStringProc = NULL ; outActorParams > mPropertyStringToValueProc = NULL ; outActorParams > mGetActorDefinedAreaProc = GetActorDefinedArea; outActorParams > mDrawActorDefinedAre aProc = DrawActorDefinedArea; outActorParams > mMouseTrackInActorDefinedAreaProc = NULL ; } // --------------------------------------------------------------------------------// ProcessVideoFrame // --------------------------------------------------------------------------------

// ### This is the code that does the actual processing of a video frame. Modify
// this code to create your own filter.
//
int pointInPolygon(float* coordinates, int polySides, float xx, float yy, int width, int height)
{
    int i, j = polySides - 1;    //delete to occlude polygon//
    bool oddNodes = false;

    float x = xx * 100.0 / width;
    float y = yy * 100.0 / height;

    for (i = 0; i < polySides; i++) {
        if ( (coordinates[(i*2)+1] <  y && coordinates[(j*2)+1] >= y)
          || (coordinates[(j*2)+1] <  y && coordinates[(i*2)+1] >= y) ) {
            if ( (coordinates[(i*2)+0]
                    + (y - coordinates[(i*2)+1]) / (coordinates[(j*2)+1] - coordinates[(i*2)+1])
                    * (coordinates[(j*2)+0] - coordinates[(i*2)+0])) < x ) {
                oddNodes = !oddNodes;
            }
        }
        j = i;
    }

    if (oddNodes == false)
        return 0;
    else
        return 1;
}

void MaskRecalculation(IsadoraParameters* ip, PluginInfo* info, int width, int height)
{
    if (info->AlphaMask != NULL)
        IzzyFree_(ip, info->AlphaMask);
    info->AlphaMask = (int*) IzzyMallocClear_(ip, sizeof(int) * width * height);

    int polySize = info->mVerticesCount / 2;

    // for each row
    SInt16 row = 0;
    while (row < height) {
        // and for each column in that row
        SInt16 col = 0;
        while (col < width) {
            int sum = 0;
            int num_of_tests = 1;

            if (info->anti_aliasing == 0) {
                sum = pointInPolygon(info->mCoordinates, polySize, col, row, width, height);
                num_of_tests = 1;
            }
            else if (info->anti_aliasing == 1) {
                sum = pointInPolygon(info->mCoordinates, polySize, col + 0.246490, row + 0.249999, width, height) +
                      pointInPolygon(info->mCoordinates, polySize, col - 0.246490, row - 0.249999, width, height);
                num_of_tests = 2;
            }


! "#" else if (info > anti_aliasing == 2 ) { sum= pointInPolygon (info > mCoordinates polySize, col 0.373411 row 0.250550 ,width,height) + pointInPolygon (info > mCoordinates polySize, col+ 0.256263 row+ 0.368119 ,width,height) + pointInPolygon (info > mCoordinates polySize, col+ 0.117148 row 0.117570 ,width,height); num_of_tests= 3 ; } else if (info > anti_aliasing == 3 ) { sum= pointInPolygon (info > mCoordinates polySize, col 0.208147 row+ 0.353730 ,width,height) + pointInPolygon (info > mCoordinates polySize, col+ 0.203849 row 0.353780 ,width,height) + pointInPolygon (info > mCoordinates polySize, col 0.292626 row 0.149945 ,width,height) + pointInPolygon (info > mCoordinates polySize, col+ 0.296924 row+ 0.149994 ,width,height); num_of_tests= 4 ; } else if (info > anti_aliasing == 4 ) { sum= pointInPolygon (info > mCoordinates polySize, col 0.334818 row+ 0.435331 ,width,height) + pointInPolygon (info > mCoordinates polySize, col+ 0.286438 row 0.393495 ,width,height) + pointInPolygon (info > mCoordinates polySize, col+ 0.459462 row+ 0.141540 ,width,height) + pointInPolygon (info > mCoordinates polySize, col 0.414498 row 0.192829 ,width,height) + pointInPolygon (info > mCoordinates polySize, col 0.183790 row+ 0.299133 ,width,height) + pointInPolygon (info > mCoordinates polySize, col 0.079263 row 0.317383 ,width,height) + pointInPolygon (info > mCoordinates polySize, col+ 0.102254 row+ 0.353730 ,width,height) + pointInPolygon (info > mCoordinates polySize, col+ 0.164216 row 0.054399 ,width,height); num_of_tests= 8 ; } else if (info > anti_aliasing == 5 ) { sum= pointInPolygon (info > mCoordinates polySize, col+ 0.285561 row+ 0.188437 ,width,height) + pointInPolygon (info > mCoordinates polySize, col+ 0.360176 row 0.065688 ,width,height) + pointInPolygon (info > mCoordinates polySize, col 0.111751 row+ 0.275019 ,width,height) + pointInPolygon (info > mCoordinates polySize, col 0.055918 row 0.215197 ,width,height) + pointInPolygon (info > mCoordinates polySize, col 0.080231 row 0.470965 ,width,height) + pointInPolygon (info > mCoordinates polyS ize, col+ 0.138721 row+ 0.409168 ,width,height) + pointInPolygon (info > mCoordinates polySize, col+ 0.384120 row+ 0.458500 ,width,height) + pointInPolygon (info > mCoordinates polySize, col 0.454968 row+ 0.134088 ,width,height) + pointInPolygon (in fo > mCoordinates polySize, col+ 0.179271 row 0.331196 ,width,height) + pointInPolygon (info > mCoordinates polySize, col 0.307049 row 0.364927 ,width,height) + pointInPolygon (info > mCoordinates polySize, col+ 0.105354 row 0.010099 ,width,height) + pointInPolygon (info > mCoordinates polySize, col 0.154180 row+ 0.021794 ,width,height) + pointInPolygon (info > mCoordinates polySize, col 0.370135 row 0.116425 ,width,height) + pointInPolygon (info > mCoordinates polySize, col+ 0.451636 row 0.300013 ,width,height) + pointInPolygon (info > mCoordinates polySize, col 0.370610 row+ 0.387504 ,width,height); num_of_tests= 15 ; } else if (info > anti_aliasing == 6 ) { sum=


! "## pointInPolygon (info > mCoordinates polySize, col+ 0.030245 row+ 0.136384 ,width,height) + pointInPolygon (info > mCoordinates polySize, col+ 0.018865 row 0.348867 ,width,height) + pointInPolygon (info > mCoordinates polySize, col 0.350114 row 0.472309 ,width,height) + pointInPolygon (info > mCoordinates polySize, col+ 0.222181 row+ 0.149524 ,width,height) + pointInPolygon (info > mCoordinates polySize, col 0.393670 row 0.266873 ,width,height) + pointInPolygon (info > mCoordinates polySize, col+ 0.404568 row+ 0.230436 ,width,height) + pointInPolygon (info > mCoordinates polySize, col+ 0.098381 row+ 0.465337 ,width,height) + pointInPolygon (info > mCoordinates polySize, col+ 0.462671 row+ 0.442116 ,width,height) + pointInPolygon (info > mCoordinates polySize, col+ 0.400373 row 0.212720 ,width,height) + pointInPolygon (info > mCoordinates polySize, col 0.409988 row+ 0.263345 ,width,height) + pointInPolygon (info > mCoordinates polySize, col 0.115878 row 0.001981 ,width,height) + pointInPolygon (info > mCoordinates pol ySize, col+ 0.348425 row 0.009237 ,width,height) + pointInPolygon (info > mCoordinates polySize, col 0.464016 row+ 0.066467 ,width,height) + pointInPolygon (info > mCoordinates polySize, col 0.138674 row 0.468006 ,width,height) + pointInPolygon ( info > mCoordinates polySize, col+ 0.144932 row 0.022780 ,width,height) + pointInPolygon (info > mCoordinates polySize, col 0.250195 row+ 0.150161 ,width,height) + pointInPolygon (info > mCoordinates polySize, col 0.181400 row 0.264219 ,width,height) + pointInPolygon (info > mCoordinates polySize, col+ 0.196097 row 0.234139 ,width,height) + pointInPolygon (info > mCoordinates polySize, col 0.311082 row 0.078815 ,width,height) + pointInPolygon (info > mCoordinates polySize, col+ 0.268379 row+ 0. 
366778 ,width,height) + pointInPolygon (info > mCoordinates polySize, col 0.040601 row+ 0.327109 ,width,height) + pointInPolygon (info > mCoordinates polySize, col 0.234392 row+ 0.354659 ,width,height) + pointInPolygon (info > mCoordinates polySize, col 0.003102 row 0.154402 ,width,height) + pointInPolygon (info > mCoordinates polySize, col+ 0.297997 row 0.417965 ,width,height); num_of_tests= 24 ; } else if (info > anti_aliasing == 7 ) { sum= pointInPolygon (info > mCoordinates polySize, col+ 0.266377 row 0.218171 ,width,height) + pointInPolygon (info > mCoordinates polySize, col 0.170919 row 0.429368 ,width,height) + pointInPolygon (info > mCoordinates polySize, col+ 0.047356 row 0.387135 ,width,height) + pointInPolygon (info > mCoordinates polySize, col 0.430063 row+ 0.363413 ,width,height) + pointInPolygon (info > mCoordinates polySize, col 0.221638 row 0.313768 ,width,height) + pointInPolygon (info > mCoordinates polySize, col+ 0.124758 row 0.197109 ,width,height) + pointInPolygon (info > mCoordinates polySize, col 0.400021 row+ 0.482195 ,width,height) + pointInPolygon (info > mCoordinates polySize, col+ 0.247882 row+ 0.152010 ,width,height) + pointInPolygon (info > mCoordinates pol ySize, col 0.286709 row 0.470214 ,width,height) + pointInPolygon (info > mCoordinates polySize, col 0.426790 row+ 0.004977 ,width,height) + pointInPolygon (info > mCoordinates polySize, col 0.361249 row 0.104549 ,width,height) + pointInPolygo n (info > mCoordinates polySize, col 0.040643 row+ 0.123453 ,width,height) + pointInPolygon (info > mCoordinates polySize, col 0.189296 row+ 0.438963 ,width,height) + pointInPolygon (info > mCoordinates polySize, col 0.453521 row 0.299889 ,width,height) + pointInPolygon (info > mCoordinates polySize, col+ 0.408216 row 0.457699 ,width,height) + pointInPolygon (info > mCoordinates polySize, col+ 0.328973 row 0.101914 ,width,height) + pointInPolygon (info > mCoordinates polySize, col 0.055540 row 0.477952 ,width,height) + pointInPolygon (info > mCoordinates polySize, col+ 0.194421 row+ 0.453510 ,width,height) + pointInPolygon (info > mCoordinates polySize, col+ 0.404051 row+ 0.224974 ,width,height) + pointInPolygon (info > mCoordi nates polySize, col+ 0.310136 row+ 0.419700 ,width,height) + pointInPolygon (info > mCoordinates polySize, col 0.021743 row+ 0.403898 ,width,height) + pointInPolygon (info > mCoordinates polySize, col 0.466210 row+ 0.248839 ,width,height) + pointInPolygon (info > mCoordinates polySize, col+ 0.341369 row+ 0.081490 ,width,height) + pointInPolygon (info > mCoordinates polySize, col+ 0.124156 row 0.016859 ,width,height) + pointInPolygon (info > mCoordinates polySize, col 0.461321 row 0.176661 ,width,height) + pointInPolygon (info > mCoordinates polySize, col+ 0.013210 row+ 0.234401 ,width,height) + pointInPolygon (info > mCoordinates polySize, col+ 0.174258 row 0.311854 ,width,height) + pointInPolygon (info > mCoordinates po lySize, col+ 0.294061 row+ 0.263364 ,width,height) + pointInPolygon (info > mCoordinates polySize, col 0.114836 row+ 0.328189 ,width,height) + pointInPolygon (info > mCoordinates polySize, col+ 0.041206 row 0.106205 ,width,height) + pointInPolygon (info > mCoordinates polySize, col+ 0.079227 row+ 0.345021 ,width,height) + pointInPolygon (info > mCoordinates polySize, col 0.109319 row 0.242380 ,width,height) + pointInPolygon (info > mCoordinates polySize, col+ 0.425005 row 0.3323 97 ,width,height) + pointInPolygon (info > mCoordinates polySize, col+ 0.009146 row+ 0.015098 ,width,height) + pointInPolygon (info > mCoordinates 
polySize, col 0.339084 row 0.355707 ,width,height) + pointInPolygon (info > mCoordinates polySize, col 0.224596 row 0.189548 ,width,height) + pointInPolygon (info > mCoordinates polySize, col+ 0.083475 row+ 0.117028 ,width,height) +


! "#$ pointInPolygon (info > mCoordinates polySize, col+ 0.295962 row 0.334699 ,width,height) + pointInPolygon (info > mCoordinates polySize, col+ 0.452998 row+ 0.025397 ,width,height) + pointInPolygon (info > mCoordinates polySize, col+ 0.206511 row 0.104668 ,width,height) + pointInPolygon (info > mCoordinates polySize, col+ 0.447544 row 0.096004 ,width,height) + pointInPolygon (info > mCoordinates polySize, col 0.108006 row 0.002471 ,width,height) + pointInPolygon (info > mCoordinates polySize, col 0.380810 row+ 0.130036 ,width,height) + pointInPolygon (info > mCoordinates polySize, col 0.242440 row+ 0.186934 ,width,height) + pointInPolygon (info > mCoordinates polySize, col 0.200363 row+ 0.070863 ,width,height) + pointInPolygon (info > mCoordinates polySize, col 0.344844 row 0.230814 ,width,height) + pointInPolygon (info > mCoordinates polySize, col+ 0.408660 row+ 0.345826 ,width,height) + pointInPolygon (info > mCoordinates polySize, col 0.233016 row+ 0.305203 ,width,height) + pointInPolygon (info > mCoordinates polySize, col+ 0.158475 row 0.430762 ,width,height) + pointInPolygon (info > mCoo rdinates polySize, col+ 0.486972 row+ 0.139163 ,width,height) + pointInPolygon (info > mCoordinates polySize, col 0.301610 row+ 0.009319 ,width,height) + pointInPolygon (info > mCoordinates polySize, col+ 0.282245 row 0.458671 ,width,height) + pointInPolygon (info > mCoordinates polySize, col+ 0.482046 row+ 0.443890 ,width,height) + pointInPolygon (info > mCoordinates polySize, col 0.121527 row+ 0.210223 ,width,height) + pointInPolygon (info > mCoordinates polySize, col 0.477606 row 0.42487 8 ,width,height) + pointInPolygon (info > mCoordinates polySize, col 0.083941 row 0.121440 ,width,height) + pointInPolygon (info > mCoordinates polySize, col 0.345773 row+ 0.253779 ,width,height) + pointInPolygon (info > mCoordinates polySize, col+ 0.234646 row+ 0.034549 ,width,height) + pointInPolygon (info > mCoordinates polySize, col+ 0.394102 row 0.210901 ,width,height) + pointInPolygon (info > mCoordinates polySize, col 0.312571 row+ 0.397656 ,width,height) + pointInPolygon (info > mCoordinates polySize, col+ 0.200906 row+ 0.333293 ,width,height) + pointInPolygon (info > mCoordinates polySize, col+ 0.018703 row 0.261792 ,width,height) + pointInPolygon (info > mCoordinates polySize, col 0.209349 row 0.065383 ,width,height) + pointInPolygon (info > mCoordinates polySize, col+ 0.076248 row+ 0.478538 ,width,height) + pointInPolygon (info > mCoordinates polySize, col 0.073036 row 0.355064 ,width,height) + pointInPolygon (info > mCoordinates polySize, col+ 0.145087 row+ 0.221726 width,height); num_of_tests= 66 ; } info > AlphaMask [row*width+col]=( 255 *sum)/num_of_tests; col++; } row++; } info > RecalulateMask = false ; } static void ProcessVideoFrame( IsadoraParameters ip // not used in this function, but needed to call PluginAssert_ PluginInfo info, ImageBufferPtr srcBuf, ImageBufferPtr outBuf) { UInt32 srcData = static_cast < UInt32 *>(srcBuf > mBaseAddress ); UInt32 srcStride = srcBuf > mRowBytes srcBuf > m Width sizeof ( UInt32 ); UInt32 outData = static_cast < UInt32 *>(outBuf > mBaseAddress ); UInt32 outStride = outBuf > mRowBytes outBuf > mWidth sizeof ( UInt32 ); info > mCurrentWidth = outBuf > mWidth ; info > mCurrentHeight = outBuf > mHeight ; // ### this is where your video processing code would go // here, we are increasing or decreasing the red, green, // and blue components of the video image.


! "#$ /* SInt16 redFactor = static_cast(256.0 info >mRedAmount); SInt16 greenFactor = static_ca st(256.0 info >mGreenAmount); SInt16 blueFactor = static_cast(256.0 info >mBlueAmount); */ // int polySize = 6; //this allows for all 6 verticies// int polySize = info > mVerticesCount / 2 ; if ((polySize <= 0 ) || (polySize < 3 )) return ; if (info > RecalulateMask == true || outBuf > mWidth !=info > AlphaMaskWidth || outBuf > mHeight !=info > AlphaMaskHeight ) MaskRecalculation (ip,info,outBuf > mWidth ,outBuf > mHeight ); // for each row SInt16 row = 0 ; while (row < outBuf > mHeight ) { // and for each column in that row SInt16 col = 0 ; while (col < outBuf > mWidth ) { // IMRORTANT: For Mac/Windows cross platform compatbility, // make sure to use the RED_, GREEN_ and BLUE_ macros to // extract pixels from a the raw data, and use RGB_ to // combine them back. // // On the Mac, the pixels are arranged 00RRGGBB with // the blue in the low 8 bits. // Under Quicktime for Windows, the data is arranged // BBGGRR00 in reverse order. Using the mac ros is // a easy way to ensure your code will operate on // both platforms. // SInt16 red = RED_ (*srcData); SInt16 green = GREEN_ (*srcData); SInt16 blue = BLUE_ (*srcData); if (info > AlphaMask != NULL ) *(outData) = ARGB_ (info > AlphaMask [row*outBuf > mWidth +col], red, green, blue); else *(outData) = ARGB_ ( 0 red, green, blue); // increment src and out pixel srcData++; outData++; // increment column count col++; } // skip to the next line of video, using the // stride values computed above srcData = ( UInt32 *)(( char *) srcData + srcStride); outData = ( UInt32 *)(( char *) outData + outStride); // increment row count row++; } if (info > mEditMode == 1 ) { for ( int movy= info > mEditSize + 1 ; movy mEditSize ; movy++)


! "#$ for ( int movx= info > mEditSize + 1 ;movx mEditSize ;movx++) for ( int i = 0 ; i < (info > mVerticesCount / 2 ); i++) { // to add edit dots to vertices// int x = info > mCoordinates [(i 2 )]*outBuf > mWidth / 100.0 +movx; if (x > outBuf > mWidth ) continue ; int y = info > mCoordinates [(i 2 ) + 1 ]*outBuf > mHeight / 100.0 +movy; if (y > outBuf > mHeight ) continue ; if ((x < 0 ) || (y < 0 )) continue ; outData = static_cast < UInt32 *>(outBuf > mBaseAddress ); outData = ( UInt32 *) ( ( char *) outData + (( (y outBuf > mWidth sizeof ( UInt32 )) + (outStride y)) ) ); outD ata += x; if (info > mCurrentVertexIndex == i) { *(outData) = ARGB_ ( 255 0 255 0 ); } else { *(outData) = ARGB_ ( 255 255 0 0 ); } } } } // --------------------------------------------------------------------------------// ReceiveMessage // --------------------------------------------------------------------------------// Isadora broadcasts messages to its Message Receives depending on what message // they are listening to. In this case, we are listening for kWantVid eoFrameTick, // which is broadcast periodically (30 times per second.) When we receive the // message, we check to see if our video frame needs to be updated. If so, we // process the incoming video and pass the newly generated frame to the output. static void ReceiveMessage( IsadoraParameters ip, MessageMask /* inMessageMask */ // the message that caused this ReceiveMessage PortIndex /* inIndex1 */ // for MIDI messages, the port where msg arrived. const MsgData /* inData */ // the data associated with this message UInt32 /* inLen */ // the length of the data associated w/ message long inRefCon) // in our use, actually the pointer to ActorInfo { // Convert the refCon into the ActorInfo* that it // really is, so that we can get at our data ActorInfo actorInfo = reinterpret_cast < ActorInfo *>(inRefCon); // get pointer to plugin info PluginInfo info = GetPluginInfo_ (actorInfo); // We use this Value struct in a few places below... Value v = { kData nil }; // set a flag to remember if we had an output buffer before we // called UpdateImageBufferMap Boolean wasOutputBuffer = info > mImageBufferMap mOutputBuffersValid ; // ensure that the ImageBufferMap is up valid for the // current input Image Buffer UpdateImageBufferMap (ip, &info > mImageBufferMap ); // use GetImageBufferPtr to get the input and output buffers


! "#$ ImageBufferPtr img1 = GetImageBufferPtr (ip, &info > mImageBufferMap 0 ); ImageBufferPtr out = GetOutputImageBufferPtr (&info > mImageBu fferMap 0 ); // if we don't have a valid output buffer if (info > mImageBufferMap mOutputBuffersValid == false ) { // if there was an output buffer preivously, we need to // send a 'nil' buffer to let other modules know that // our ouptut is now invalid if (wasOutputBuffer) { v. u data = nil ; SetOutputPropertyValue_ (ip, actorInfo, kOutputVideo &v); } // otherwise, if our mNeedsDraw flag is true, and if we have both // and input buffer and an output buffer, then we can proceed to // process the video image // // we only draw the image if the following are true: // 1) the mNeedsDraw variable is set to true (this is set in the // InputPropertyChangeValue callback above.) // 2) the input image buffer (img1) is not nil // 3) the output image buffer (out) is not nil } else if (info > mNeedsDraw && img1 != nil && out != nil ) { // call EnterVideoProcessing_ so that Isadora will accumulate the // amount of time spent processing the video data this is not // requried by highly recommended so that the VPO value in the // Status Window stays accurate. UInt64 vpStart = EnterVideoProcessing_ (ip); // clear the mNeedsDraw flag info > mNeedsDraw = false ; // assume for the moment that we won't draw the frame // set this value to true if we change the output Boolean drawFrame = false ; // we only process 32 bit data in this plugin if (out > mBitDepth == 32 ) { ProcessVideoFrame (ip, info, img1, out); drawFrame = true ; } // if the drawFrame flag got set, then we need to output the // new video data to our output port here. if (drawFrame) { // IMPORTANT: We have changed the data in the output buffer // so we need to increment the data change count so that // those looking at our data will know that there is new // data in the buffer out > mInfo mDataChangeCount ++; v. u data = out; // send the new video frame to the video output property SetOutputPropertyValue_ (ip, info > mActorInfoPtr kOutputVi deo &v); } // make sure to compliment EnterVideoProcessing_ with // and ExitVideoProcessing_ call ExitVideoProcessing_ (ip, vpStart); } }


APPENDIX C: Installation Photo Gallery

3D model in Maya

View of projector image directions onto installations

View of Christie projector placement in REVE: Digital Worlds Institute.

Stage Left Center Installation "The Tall One"

Far Stage Left Installation "The Arch"

Stage Right Installation "The Mound" Configuration: Closed

Stage Right Installation "The Mound" Configuration: Open

Stage Right Installation "The Mound" Configuration: Throne

Stage Right Installation "The Mound" Configuration: Split

Construction

Stage Right Installation "The Mound" Configuration: Closed

Stage Left Center Installation

Stage Left Installation: "The Tall One"

"The Arch"

Stage Right Installation "The Mound" Configuration: Open

Stage Left Installations "The Tall One" and "The Arch"

Mapping

Initial mapping session (closed configuration)

Final screenshot of vertex points (closed configuration)

Precision mapping with value sliders (throne configuration)

Final screenshot of vertex points (throne configuration)

APPENDIX D: The Last Unicorn Production Photo Gallery

Stage Right Installation (open configuration): Mommy Fortuna Magic

Stage Left Installations: Magic Heart

Stage Right Installation (closed configuration): Unicorn Forest

Stage Left Installations: Balcony

Stage Right Installation (throne configuration): Throne Room

Full Stage: Woods

Full Stage: Carnival

Stage Left Installations: Along the Road

Full Stage: Bedchamber

Full Stage: Bull Fire

Full Stage: Unicorn Waves

Full Stage: Balcony

LIST OF REFERENCES

Beagle, Peter S. The Last Unicorn. London: Penguin Books, 1968, 1991. Print.

Beagle, Peter S. The Last Unicorn. San Diego, CA: IDW Pub., 2011. Graphic Novel.

Beck, Julian. "Our Mission | Living Theatre." Home | Living Theatre. Web. 17 Oct. 2011. <http://www.livingtheatre.org/about/ourmission>.

Bloom, Harold. Tennessee Williams's The Glass Menagerie. New York: Chelsea House, 2007. Print.

Borovay, Zachary. "I Am Not a Lighting Designer." Live Design (2006). livedesignonline.com. Live Design. Web. 15 June 2010. <http://livedesignonline.com/mag/am_not_lighting/?smte=wl>.

Brecht, Bertolt, and John Willett. Brecht on Theatre; the Development of an Aesthetic. New York: Hill and Wang, 1964. Print.

Bursill, Henry. "Shadow Puppets: A Pupet Shadow Comparison." Shadow Puppets: A Series of Novel and Amusing Figures Formed by the Hand. Shadow-Puppets.com, 2006. Web. 1 Oct. 2011. <http://www.shadow-puppets.com/Asian_and_European_Puppetry_Comparison.html>.

"Christie Projectors: Christie DHD670-E DLP Projector." Projectors, Projector Reviews, LCD Projectors, Home Theater DLP Projectors at ProjectorCentral.com. Web. 25 July 2011. <http://www.projectorcentral.com/Christie-DHD670-E.htm>.

Dawson, Gary Fisher. Documentary Theatre in the United States: An Historical Survey and Analysis of Its Content, Form, and Stagecraft. Westport, CT: Greenwood, 1999. Print.

Dodson, Bryan, and Michelle Dodson. "Projection Mapping Introduction." Web log post. Video Mapping Blog. Integrated Visions Productions, 2010. Web. 12 Sept. 2011. <http://videomapping.tumblr.com/>.

Esaak, Shelley. "Dada - Art History Basics on the Dada Movement - 1916-1923." Art History Resources for Students, Enthusiasts, Artists and Educators - Artist Biographies - Art Timelines - Images and Picture Galleries. Web. 29 Oct. 2011. <http://arthistory.about.com/cs/arthistory10one/a/dada.htm>.

Finley, Darrel Rex. "Point-In-Polygon Algorithm - Determining Whether A Point Is Inside A Complex Polygon." Alien Ryder Flex: The WWW Homepage of Darrel Rex Finley. 2007. Web. 30 May 2011. <http://alienryderflex.com/polygon/>.

Hopkins, David. Dada and Surrealism: A Very Short Introduction. London: Oxford University Press, 2004. Print.

Innes, C. D. Erwin Piscator's Political Theatre: The Development of Modern German Drama. Cambridge: Cambridge UP, 1972. Print.

"Jitter.h." Opengl.org. Silicon Graphics, Inc. Web. 07 Sept. 2011. <http://www.opengl.org/resources/code/samples/redbook/jitter.h>.

Kramer, Richard E. ""The Sculptural Drama": Tennessee Williams's Plastic Theatre." The Tennessee Williams Annual Review. 2001. Web. 10 Sept. 2011. <http://www.tennesseewilliamsstudies.org/archives/2002/3kramer_print.htm>.

Leverich, Lyle. Tom: The Unknown Tennessee Williams. New York: Norton, 1995. Print.

LoBiondo, Maria. "Emily Mann, Artistic Director of McCarter Theatre." Princeton Online. Web. 07 Oct. 2011. <http://www.princetonol.com/patron/emann/>.

Piscator, Erwin. "The Berlin Production of Paquet's Flags." Essays on German Theater. Comp. Margaret Herzfeld-Sander. New York: Continuum, 1985. 182-85. Print.

Probst, Gerhard F. Erwin Piscator and the American Theatre. New York: P. Lang, 1991. Print.

Savran, David. The Wooster Group, 1975-1985: Breaking the Rules. Ann Arbor, Michigan: UMI Research Press, 1986. Print.

Szanto, George H. "Information, Distortion, Propaganda: Control Factors in Technological Societies." Theater & Propaganda. Austin: University of Texas, 1978. 23+. Print.

The Builders Association. Web. 17 Oct. 2011. <http://www.thebuildersassociation.org/about_mission.html>.

The Last Unicorn. Dir. Jules Bass and Arthur Rankin Jr. Perf. Mia Farrow, Jeff Bridges and Angela Lansbury. Jensen Farley Pictures, 1982. Videocassette.

Tytell, John. The Living Theatre: Art, Exile, and Outrage. New York: Grove, 1995. Print.

PROJECTION DESIGNERS' PERSONAL WEBSITES AND COMPANY WEBSITES

1024 Architecture / Creative Label / Art Installation / Video Mapping / Ex-EXYZT. Web. 14 Sept. 2011. <http://www.1024architecture.net>.

Peter Flaherty. The Four // The Five. Web. 20 Sept. 2011. <http://www.thefourthefive.org/peter.html>.

V Squared Labs. Web. 15 Aug. 2011. <http://vsquaredlabs.com>.

Wendall Harrington. Web. 13 July 2011. <http://www.wendallharrington.com>.

Zachary Borovay Projection Designer. Web. 05 June 2011. <http://www.borovay.com>.

PROJECTION SOFTWARE AND OTHER RELATED WEBSITES

Ableton Homepage. Web. 13 Aug. 2011. <http://www.ableton.com>.

Cycling 74. Web. 17 Oct. 2011. <http://cycling74.com/>.

Dataton Watchout. Web. 10 Oct. 2011. <http://www.dataton.com/watchout>.

"Figure 53 | QLab | Live Show Control for Mac OS X." Figure 53 | QCart | Audio Cart for Mac OS X. Web. 17 Oct. 2011. <http://figure53.com/qlab>.

FreeFrame Open Realtime Video Effects. Web. 20 Oct. 2011. <http://freeframe.sourceforge.net/about.html>.

Green Hippo The Worlds Best Solution for Realtime Video Playback on Events Worldwide Home. Web. 17 Oct. 2011. <http://www.green-hippo.com/>.

"Iduun Releases the Much Anticipated MapMapMap Module Modul8 Blog." Modul8 Blog. Web. 17 Oct. 2011. <http://www.modul8.us/?p=428>.

MadMapper | The Video Mapping Software. Web. 16 Oct. 2011. <http://www.madmapper.com/>.

Modul8 VJ Software. Web. 19 Oct. 2011. <http://www.modul8.ch>.

Quartz Composer. Web. 17 Oct. 2011. <http://quartzcomposer.com/>.

"Resolume VJ Software Resolume Avenue 3 Features." Resolume VJ Software Live Digital Motion Graphics. Web. 17 Oct. 2011. <http://www.resolume.com/avenue/features.php>.

Syphon. Web. 15 Oct. 2011. <http://syphon.v002.info>.

"TroikaTronix Isadora." TroikaTronix Home. Web. 03 June 2011. <http://www.troikatronix.com/isadora.html>.

"VPT 6.0 Conversations with Spaces." Conversations with Spaces. Web. 04 Sept. 2011. <http://hcgilje.wordpress.com/vpt>.

WIKIPEDIA SOURCES

"Incandescent Light Bulb." Wikipedia, the Free Encyclopedia. Web. 20 Oct. 2011. <http://en.wikipedia.org/wiki/Incandescent_light_bulb>.

"Magic Lantern." Wikipedia, the Free Encyclopedia. Web. 7 Oct. 2011. <http://en.wikipedia.org/wiki/Magic_lantern>.

"MIDI." Wikipedia, the Free Encyclopedia. Web. 16 Aug. 2011. <http://en.wikipedia.org/wiki/MIDI>.

"Moiré Pattern." Wikipedia, the Free Encyclopedia. Web. 20 Oct. 2011. <http://en.wikipedia.org/wiki/Moire>.

"Spatial Anti-aliasing." Wikipedia, the Free Encyclopedia. Web. 18 Aug. 2011. <http://en.wikipedia.org/wiki/Anti-aliasing>.

"Supersampling." Wikipedia, the Free Encyclopedia. Web. 20 Sept. 2011. <http://en.wikipedia.org/wiki/Supersampling>.

PHOTOGRAPHY CREDITS

"Moiré on Parrot Feathers.jpg." Wikipedia, the Free Encyclopedia. Web. 20 Oct. 2011. <http://en.wikipedia.org/wiki/File:Moire_on_parrot_feathers.jpg>. Photograph licensed under the GNU Free Documentation License at <http://en.wikipedia.org/wiki/GNU_Free_Documentation_License>.

Projection software graphical user interface images are attributed to the respective companies. Further information can be found at the individual companies' websites listed in the Projection Software Websites section of the List of References.

Images of the installations provided by the author, Brittany Powell.

BIOGRAPHICAL SKETCH

Brittany Powell was born in Jacksonville, Florida. She is the younger of two daughters and graduated from Mandarin High School. She began graduate study at the University of Florida, where in her first semester she took three courses at the Digital Worlds Institute.