A wikidocumentary is a layered, time-based presentation composed of simple elements, and it can be viewed in a few different ways.
- 1 What would a wikidocumentary look like?
- 2 Elements
- 3 Different editor views
- 4 Interactivity
- 5 Map-based wikidocumentary
- 6 Old graphics
- 7 Navigation
What would a wikidocumentary look like?
- Map, with styles
- Background layers: Historical maps, aerial images, image or video overlays
- Data layers: Toponyms, POIs, shapes, lines, visualizations
- Foreground elements: Map, text, visuals (video, images)
- Soundtrack: Audio, video soundtrack, soundscape
- Transitions: Soundscapes, specific content. Data animations run and set the pace of the interval. Auto-advance vs. scrolling, for example
- Frames: In-depth content – reading, image series.
- In frames, time is paused, while in transitions it is running. Both can display information. Intervals may form one backbone or be separate elements between frames.
- Web interface, page layout: Scrolling or clicking between frames. Maybe the user can change the layout and mode?
- Sit back cinema (auto-advance), full-screen, timed
- Mobile orienteering mode. Frames are located in places that the user can physically enter. There must be flexibility in the timelines to allow for different transition times. Use soundscapes and other media loops, trigger frames by proximity, etc.
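The layered structure described above could be sketched as data types. This is a minimal, illustrative model only; all type and field names (BackgroundLayer, Transition, mode, etc.) are assumptions, not an existing schema.

```typescript
// Illustrative types for the layers of a wikidocumentary; names are assumptions.
type BackgroundLayer =
  | { kind: "historicalMap"; url: string }
  | { kind: "aerialImage"; url: string }
  | { kind: "overlay"; mediaUrl: string };

type DataLayer = { kind: "toponyms" | "pois" | "shapes" | "lines"; source: string };

interface Transition {
  durationSeconds: number; // pace is set by running data animations
  autoAdvance: boolean;    // auto-advance vs. user scrolling
  soundscapeUrl?: string;
}

interface Wikidocumentary {
  title: string;
  background: BackgroundLayer[];
  data: DataLayer[];
  frames: string[];        // frame ids; in frames, time is paused
  transitions: Transition[]; // between frames, time is running
  mode: "scroll" | "cinema" | "orienteering";
}

const demo: Wikidocumentary = {
  title: "Demo",
  background: [{ kind: "historicalMap", url: "map.png" }],
  data: [],
  frames: ["intro"],
  transitions: [{ durationSeconds: 5, autoAdvance: true }],
  mode: "cinema",
};
```

The three viewing modes map directly to the layout options above: scrolling web page, sit-back cinema, and mobile orienteering.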
Elements
Frames are the building blocks of a wikidocumentary.
- Image or video. Later, multiple media files and an image sequence may be made possible.
- Frame audio can be a soundscape or a loop / generative sound played while the user is in the frame, or alternatively a sound that plays only once.
- Frame title
- Text that can contain links
- Location. Each frame can be depicted by a coordinate location, which can be used in different ways: it can be plotted on a map, or it can be a location to visit in order to experience AR content.
- Background layer. Georeferenced imagery can be used as a background layer. It can be an old map, an aerial image, an aerial video, or alternatively a large zoomable image.
- Data. Data can be geographic data plotted on the map or annotations on an image or a map. A data point in a time series can be the basis for a frame.
- Audio track. The audio track can be, for example, a narrative, a musical piece, or an ambient recording; frames are anchored on its timeline. It can be an audio file or the audio track of a video file, or the audio can be recorded in the app.
- Time series data (Wikidata query, file). Examples include a travel route or any animated thematic or statistical map.
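The frame elements listed above could be collected into a single record type. This is a sketch under assumed names; none of these fields come from an existing implementation.

```typescript
// One frame of a wikidocumentary; all field names are illustrative assumptions.
interface Frame {
  title: string;
  text?: string;                                 // may contain links
  media?: { kind: "image" | "video"; url: string };
  frameAudio?: { url: string; loop: boolean };   // soundscape/loop, or one-shot
  location?: { lat: number; lon: number };       // map plot or AR anchor
  backgroundLayerId?: string;                    // georeferenced imagery
  dataLayerId?: string;                          // geographic data or annotations
  timelineAnchorSeconds?: number;                // position on the audio track
}

const harbourFrame: Frame = {
  title: "Old harbour",
  text: "The harbour in 1905, with links to sources.",
  media: { kind: "image", url: "harbour-1905.jpg" },
  frameAudio: { url: "gulls.ogg", loop: true },
  location: { lat: 60.167, lon: 24.952 },
  timelineAnchorSeconds: 42,
};
```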
Different editor views
Based on key source material
- Editor for text
- Editor for a timeline
- Editor for a collection of images, video or frames
- Editor for a map
- Different amounts of screen real estate
- Possibility to easily record images, sound, video or location
- Possibility to connect to other contributors or share
Ways of using AR
- The map background is used in navigating between frames. Frames can be accessed in random order.
- The frames can display AR content. Ideas?
- Things that the user can be asked to do: Take pictures, record a fact, record audio...
- Wikidocumentaries as gradually evolving presentations: Users can add an aspect to an existing wikidocumentary.
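Triggering frames by proximity, as in the orienteering mode above, could work along these lines: a great-circle distance (haversine) between the user's GPS position and the frame's coordinate, compared against a radius. The function names and the 30 m default radius are assumptions for illustration.

```typescript
// Great-circle (haversine) distance between two WGS84 points, in metres.
function distanceMetres(
  lat1: number, lon1: number, lat2: number, lon2: number
): number {
  const R = 6371000; // mean Earth radius in metres
  const toRad = (d: number) => (d * Math.PI) / 180;
  const dLat = toRad(lat2 - lat1);
  const dLon = toRad(lon2 - lon1);
  const a =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(lat1)) * Math.cos(toRad(lat2)) * Math.sin(dLon / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(a));
}

// Trigger a frame when the user is within its radius (assumed 30 m default).
function frameTriggered(
  user: { lat: number; lon: number },
  frame: { lat: number; lon: number; radiusMetres?: number }
): boolean {
  return (
    distanceMetres(user.lat, user.lon, frame.lat, frame.lon) <=
    (frame.radiusMetres ?? 30)
  );
}
```

In a browser, the user position would come from the Geolocation API's `watchPosition`, with the trigger check run on each update.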
Transferred from Map
Based on frames that consist of several parameters
- Coordinate location
- View angle
- Time value
- Visual media
- Image plane and view angle for overlays (on the map background, in 3D space)
- Medium (video, image, image series, image pair)
- Headline & subheader
- Fixed timeline throughout presentation, such as a narrative
- Fixed timeline over one frame
- Soundscape loops or generative soundscapes
- Effects, synchronized sound
- Map settings
- Map style
- Timeline display
- Map is the base element
- Each frame represents a point from which the map is seen
- Map settings are different for each map frame
- An aerial image/video/illustration may overlay the map
- A media element may be added to the 3D view
- An image pair can be viewed with a revealing slider
- Map remains controllable in frames
- Text can be scrollable including parallax elements.
- Media can be used as part of layout in the text stream. Media can be expanded to cover the full screen.
- Transitions from one frame to another may also differ, such as a crossfade between media elements.
- Frame display triggered by GPS info
- Dynamic output
- soundscape, colours, alternative content, nearby geotagged content
- depending on data
- proximity to a coordinate location
- weather data
- time of day, time of year...
- Interface to create frames and configure settings
- AI-assisted frame generation from Wikipedia articles
- User-enhanced in the same interface
- Animations of time-series, how do they relate to frames? Can frames be marked as time intervals, or points in time?
- Can wikidocumentaries be multidimensional, including, for example, temporal depth across their own timelines?
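The dynamic output idea above (soundscape or content variants depending on proximity, weather, or time of day) could be a simple rule lookup. The variant names and thresholds here are assumptions, not a specified behaviour.

```typescript
// Pick a content variant from simple context rules; names are assumptions.
interface Context {
  hour: number;           // local time of day, 0–23
  metresToFrame: number;  // proximity to the frame's coordinate location
  weather?: "rain" | "clear";
}

function pickSoundscape(ctx: Context): string {
  if (ctx.metresToFrame <= 30) return "on-site-loop"; // user has arrived
  if (ctx.weather === "rain") return "rain-ambience";
  return ctx.hour >= 21 || ctx.hour < 6 ? "night-ambience" : "day-ambience";
}
```

The same pattern would extend to colours, alternative content, or nearby geotagged content: each is a function of the current context rather than a fixed asset.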