Wikidocumentary
Transferred from [[Map]]
Revision as of 09:50, 3 November 2021
A wikidocumentary is a layered, time-based presentation that comprises simple elements, and it can be viewed in a few different ways.
- 1 Elements
- 2 Modes of experiencing
- 3 Interactivity
- 4 Different editor views
- 5 Map-based wikidocumentary
- 6 Navigation
Time-based media as a backbone and between frames
- Audio track. The audio track can be a narrative, a musical piece, or an ambient recording for example.
- File. Audio file or the audio track of a video file.
- Record audio. The audio can originally derive from another element, such as a reading of a letter.
- Time series material (Wikidata query, file)
- Time series data. Need examples (Use cases: Travel route, any animated thematic, statistical map).
Frames are moments on the timeline of a wikidocumentary. If there is no backbone timeline, the frames follow each other freely. They can also be specific "timecodes" on the main timeline.
Media for the page layout
- Image or video. Later, multiple media files and an image sequence may be made possible.
- Audio soundscape loop / generative sound to be played while user is in the frame.
- A piece of text forms the content of a frame
- Links are possible.
- Text can be recorded, or screenread.
- A timeline of data items can be used as the set of frames
- Data items representing historical events carry the timestamp of the items.
- Each frame can be depicted by a coordinate location. This can be used in different ways: it can be plotted on a map, or it can be a location the user goes to in order to experience AR content.
- Georeferenced imagery can be used as a background layer. It can be an old map, an aerial image, or an aerial video.
- Data plotted on the map
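The elements above suggest a simple frame model. The sketch below is a hypothetical illustration of that model, assuming a frame may carry an optional timecode on the backbone timeline plus optional media, text, soundscape, and location fields; none of these names come from an existing Wikidocumentaries API.

```typescript
// Hypothetical frame model; all field names are illustrative assumptions.
interface Frame {
  timecode?: number;                          // seconds on the backbone timeline, if one exists
  media?: { type: "image" | "video"; url: string };
  soundscape?: string;                        // looping audio played while the user is in the frame
  text?: string;                              // frame content; may contain links
  location?: { lat: number; lon: number };    // for map plotting or AR triggering
}

// Frames with timecodes are pinned to the backbone timeline; frames
// without one simply follow each other freely, as described above.
function orderFrames(frames: Frame[]): Frame[] {
  const timed = frames
    .filter(f => f.timecode !== undefined)
    .sort((a, b) => a.timecode! - b.timecode!);
  const free = frames.filter(f => f.timecode === undefined);
  return [...timed, ...free];
}
```

This keeps the two cases in one structure: a documentary with a backbone audio track sorts its frames by timecode, while a free-form one just preserves insertion order.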
Modes of experiencing
- Map, with styles
- Background layers: Historical maps, aerial images, image or video overlays
- Data layers: Toponyms, POIs, shapes, lines, visualizations
- Foreground elements: Map, text, visuals (video, images)
- Soundtrack: Audio, video soundtrack, soundscape
- Transitions: Soundscapes, specific content. Data animations run and set the pace of the interval. Auto-advance vs. scrolling, for example.
- Frames: In-depth content – reading, image series.
- In frames, time is paused, while in transitions it is running. Both can display information. Intervals may form one backbone or be separate elements between frames.
- Web interface, page layout: Scrolling or clicking between frames. Maybe the user can change the layout and mode?
- Sit back cinema (auto-advance), full-screen, timed
- Mobile orienteering mode. Frames are tied to physical locations that the user visits. The timelines must be flexible to allow for different transition times. Use soundscapes and other media loops, trigger frames by proximity, etc.
Ways of using AR
- The map background is used in navigating between frames. Frames can be accessed in random order.
- The frames can display AR content. Ideas?
- Things that the user can be asked to do: Take pictures, record a fact, record audio...
- Wikidocumentaries as gradually evolving presentations: Users can add an aspect to an existing wikidocumentary.
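The proximity triggering mentioned for the mobile orienteering mode could be sketched as follows. This is an assumption about how it might work, not an implemented feature: a frame opens when the user's GPS position comes within a radius of the frame's coordinate location, using the standard haversine great-circle distance. The function names and the 25-metre default are illustrative.

```typescript
// Great-circle distance between two WGS84 points, in metres (haversine formula).
function haversineMeters(lat1: number, lon1: number, lat2: number, lon2: number): number {
  const R = 6371000; // mean Earth radius in metres
  const toRad = (d: number) => (d * Math.PI) / 180;
  const dLat = toRad(lat2 - lat1);
  const dLon = toRad(lon2 - lon1);
  const a =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(lat1)) * Math.cos(toRad(lat2)) * Math.sin(dLon / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(a));
}

// Hypothetical trigger check: does the user's position activate this frame?
function frameTriggered(
  user: { lat: number; lon: number },
  frame: { lat: number; lon: number },
  radiusMeters = 25
): boolean {
  return haversineMeters(user.lat, user.lon, frame.lat, frame.lon) <= radiusMeters;
}
```

A real implementation would also need debouncing (GPS jitter near the radius boundary) and the flexible transition timing noted above.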
Different editor views
- Editor for text
- Editor for a timeline
- Editor for a collection of images (frames)
- Editor for a map
Transferred from Map
Based on frames that consist of several parameters
- Coordinate location
- View angle
- Time value
- Visual media
- Image plane and view angle for overlays (on the map background, in 3D space)
- Medium (video, image, image series, image pair)
- Headline & subheader
- Fixed timeline throughout presentation, such as a narrative
- Fixed timeline over one frame
- Soundscape loops or generative soundscapes
- Effects, synchronized sound
- Map settings
- Map style
- Timeline display
- Map is the base element
- Each frame represents a point from which the map is seen
- Map settings are different for each map frame
- An aerial image/video/illustration may overlay the map
- A media element may be added to the 3D view
- An image pair can be viewed with a revealing slider
- Map remains controllable in frames
- Text can be scrollable including parallax elements.
- Media can be used as part of layout in the text stream. Media can be expanded to cover the full screen.
- Transitions from one frame to another may also differ, such as a crossfade between media elements.
- Frame display triggered by GPS info
- Dynamic output
- soundscape, colours, alternative content, nearby geotagged content
- depending on data
- proximity to a coordinate location
- weather data
- time of day, time of year...
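One way to read the "dynamic output" idea above is a selector that maps context data to content. The sketch below assumes the simplest case, choosing a soundscape by time of day; the thresholds and file names are made up for illustration.

```typescript
// Illustrative dynamic-output selector: soundscape chosen by local hour.
// Thresholds and file names are assumptions, not project assets.
function pickSoundscape(hour: number): string {
  if (hour >= 5 && hour < 10) return "dawn-chorus.ogg";
  if (hour >= 10 && hour < 18) return "daytime-street.ogg";
  if (hour >= 18 && hour < 22) return "evening-ambience.ogg";
  return "night-quiet.ogg";
}
```

The same shape extends to the other inputs listed above: proximity to a coordinate, weather data, or time of year would simply become additional parameters feeding the selection.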
- Interface to create frames and configure settings
- AI-assisted frame generation from Wikipedia articles
- User-enhanced in the same interface
- Animations of time-series, how do they relate to frames? Can frames be marked as time intervals, or points in time?
- Can wikidocumentaries be multidimensional, including, for example, temporal depth across their own timeline?
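One possible answer to the interval-vs-point question above (an assumption, not a decided model) is to let every frame span a time interval, with a point-in-time frame as the degenerate case where start equals end. A time-series animation can then ask which frames are active at any moment:

```typescript
// Hypothetical temporal frame: an interval on the documentary's own
// (historical) time axis, e.g. in years. A point in time is start === end.
interface TemporalFrame { start: number; end: number; }

// Which frames are active at historical time t? Useful for pacing a
// time-series animation against the frames.
function framesActiveAt(frames: TemporalFrame[], t: number): TemporalFrame[] {
  return frames.filter(f => f.start <= t && t <= f.end);
}
```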