
Commit a4da778

update video thumbs
1 parent 6ed39a8 commit a4da778


53 files changed (+202 -184 lines)

_publications/2025-audio_visual_notifications.html

Lines changed: 1 addition & 1 deletion
@@ -29,7 +29,7 @@
 - Notifications
 venue: IEEE VR
 
-#video-thumb: d_XVt42hnqY
+video-thumb: i2mllqxa5_4
 #video-30sec: d_XVt42hnqY
 video-suppl: i2mllqxa5_4
 #video-talk-5min: l9ycUrf50TE

_publications/2025-minimates.html

Lines changed: 1 addition & 1 deletion
@@ -29,7 +29,7 @@
 - Telepresence
 venue: ACM CHI
 
-#video-thumb: 7K3eouLCcSw
+video-thumb: fukfCSvmo44
 #video-30sec: 7K3eouLCcSw
 video-suppl: fukfCSvmo44
 #video-talk-5min: l9ycUrf50TE

_publications/2025-persistant_assistant.html

Lines changed: 1 addition & 1 deletion
@@ -31,7 +31,7 @@
 - Adaptive User Interfaces
 venue: ACM CHI
 
-#video-thumb: 7K3eouLCcSw
+video-thumb: eOUbB9NwqZw
 #video-30sec: 7K3eouLCcSw
 video-suppl: eOUbB9NwqZw
 #video-talk-5min: l9ycUrf50TE

_publications/2025-sensing_noticeability.html

Lines changed: 1 addition & 1 deletion
@@ -26,7 +26,7 @@
 - Adaptive User Interfaces
 venue: ACM CHI
 
-#video-thumb: 7K3eouLCcSw
+video-thumb: h1367g
 #video-30sec: 7K3eouLCcSw
 video-suppl: Q9I8-h1367g
 #video-talk-5min: l9ycUrf50TE

_site/feed.xml

Lines changed: 1 addition & 1 deletion
@@ -1 +1 @@
-<?xml version="1.0" encoding="utf-8"?><feed xmlns="http://www.w3.org/2005/Atom" ><generator uri="https://jekyllrb.com/" version="3.9.0">Jekyll</generator><link href="http://localhost:4000/feed.xml" rel="self" type="application/atom+xml" /><link href="http://localhost:4000/" rel="alternate" type="text/html" /><updated>2025-04-28T18:13:00-05:00</updated><id>http://localhost:4000/feed.xml</id><title type="html">CMU Augmented Perception Lab</title><subtitle>Augmented Perception Lab at Carnegie Mellon University in Pittsburgh.</subtitle></feed>
+<?xml version="1.0" encoding="utf-8"?><feed xmlns="http://www.w3.org/2005/Atom" ><generator uri="https://jekyllrb.com/" version="3.9.0">Jekyll</generator><link href="http://localhost:4000/feed.xml" rel="self" type="application/atom+xml" /><link href="http://localhost:4000/" rel="alternate" type="text/html" /><updated>2025-04-28T18:18:03-05:00</updated><id>http://localhost:4000/feed.xml</id><title type="html">CMU Augmented Perception Lab</title><subtitle>Augmented Perception Lab at Carnegie Mellon University in Pittsburgh.</subtitle></feed>

_site/index.html

Lines changed: 10 additions & 0 deletions
@@ -169,6 +169,11 @@ <h2 class="f3">Selected Publications</h2>
 style="background-image: url('/assets/publications/2025-sensing_noticeability_thumb.png')">
 
 
+<iframe width="400" height="240" class="thumb-video"
+src="https://www.youtube.com/embed/h1367g?iv_load_policy=3&modestbranding=1&rel=0&autohide=1&playsinline=1&controls=1&showinfo=0&autoplay=0&loop=0&mute=1"
+frameborder="0" allowfullscreen>
+</iframe>
+
 </div>
 
 <div class="measure-wide mv3 min-width-front">
@@ -203,6 +208,11 @@ <h3 class="mt0">
 style="background-image: url('/assets/publications/2025-minimates_thumb.png')">
 
 
+<iframe width="400" height="240" class="thumb-video"
+src="https://www.youtube.com/embed/fukfCSvmo44?iv_load_policy=3&modestbranding=1&rel=0&autohide=1&playsinline=1&controls=1&showinfo=0&autoplay=0&loop=0&mute=1"
+frameborder="0" allowfullscreen>
+</iframe>
+
 </div>
 
 <div class="measure-wide mv3 min-width-front">

_site/publications.html

Lines changed: 52 additions & 44 deletions
Large diffs are not rendered by default.

_site/publications/2013-suggero.html

Lines changed: 2 additions & 2 deletions
@@ -50,9 +50,9 @@
 <meta property="og:url" content="http://localhost:4000/publications/2013-suggero.html" />
 <meta property="og:site_name" content="CMU Augmented Perception Lab" />
 <meta property="og:type" content="article" />
-<meta property="article:published_time" content="2025-04-28T18:13:00-05:00" />
+<meta property="article:published_time" content="2025-04-28T18:18:03-05:00" />
 <script type="application/ld+json">
-{"author":{"@type":"Person","name":"David Lindlbauer"},"url":"http://localhost:4000/publications/2013-suggero.html","description":"Modifying a digital sketch may require multiple selections before a particular editing tool can be applied. Especially on large interactive surfaces, such interactions can be fatiguing. Accordingly, we propose a method, called Suggero, to facilitate the selection process of digital ink. Suggero identifies groups of perceptually related drawing objects. These “perceptual groups” are used to suggest possible extensions in response to a person’s initial selection. Two studies were conducted. First, a background study investigated participant’s expectations of such a selection assistance tool. Then, an empirical study compared the effectiveness of Suggero with an existing manual technique. The results revealed that Suggero required fewer pen interactions and less pen movement, suggesting that Suggero minimizes fatigue during digital sketching.","@type":"BlogPosting","headline":"Perceptual grouping: selection assistance for digital sketching","dateModified":"2025-04-28T18:13:00-05:00","datePublished":"2025-04-28T18:13:00-05:00","mainEntityOfPage":{"@type":"WebPage","@id":"http://localhost:4000/publications/2013-suggero.html"},"@context":"https://schema.org"}</script>
+{"author":{"@type":"Person","name":"David Lindlbauer"},"url":"http://localhost:4000/publications/2013-suggero.html","description":"Modifying a digital sketch may require multiple selections before a particular editing tool can be applied. Especially on large interactive surfaces, such interactions can be fatiguing. Accordingly, we propose a method, called Suggero, to facilitate the selection process of digital ink. Suggero identifies groups of perceptually related drawing objects. These “perceptual groups” are used to suggest possible extensions in response to a person’s initial selection. Two studies were conducted. First, a background study investigated participant’s expectations of such a selection assistance tool. Then, an empirical study compared the effectiveness of Suggero with an existing manual technique. The results revealed that Suggero required fewer pen interactions and less pen movement, suggesting that Suggero minimizes fatigue during digital sketching.","@type":"BlogPosting","headline":"Perceptual grouping: selection assistance for digital sketching","dateModified":"2025-04-28T18:18:03-05:00","datePublished":"2025-04-28T18:18:03-05:00","mainEntityOfPage":{"@type":"WebPage","@id":"http://localhost:4000/publications/2013-suggero.html"},"@context":"https://schema.org"}</script>
 <!-- End Jekyll SEO tag -->
 
 </head>

_site/publications/2014-chair.html

Lines changed: 2 additions & 2 deletions
@@ -50,9 +50,9 @@
 <meta property="og:url" content="http://localhost:4000/publications/2014-chair.html" />
 <meta property="og:site_name" content="CMU Augmented Perception Lab" />
 <meta property="og:type" content="article" />
-<meta property="article:published_time" content="2025-04-28T18:13:00-05:00" />
+<meta property="article:published_time" content="2025-04-28T18:18:03-05:00" />
 <script type="application/ld+json">
-{"author":{"@type":"Person","name":"Kathrin Probst"},"url":"http://localhost:4000/publications/2014-chair.html","description":"During everyday office work we are used to controlling our computers with keyboard and mouse, while the majority of our body remains unchallenged and the physical workspace around us stays largely unattended. Addressing this untapped potential, we explore the concept of turning a flexible office chair into a ubiquitous input device. To facilitate daily desktop work, we propose the utilization of semaphoric chair gestures that can be assigned to specific application functionalities. The exploration of two usage scenarios in the context of focused and peripheral interaction demonstrates high potential of chair gestures as additional input modality for opportunistic, hands-free interaction.","@type":"BlogPosting","headline":"A chair as ubiquitous input device: exploring semaphoric chair gestures for focused and peripheral interaction","dateModified":"2025-04-28T18:13:00-05:00","datePublished":"2025-04-28T18:13:00-05:00","mainEntityOfPage":{"@type":"WebPage","@id":"http://localhost:4000/publications/2014-chair.html"},"@context":"https://schema.org"}</script>
+{"author":{"@type":"Person","name":"Kathrin Probst"},"url":"http://localhost:4000/publications/2014-chair.html","description":"During everyday office work we are used to controlling our computers with keyboard and mouse, while the majority of our body remains unchallenged and the physical workspace around us stays largely unattended. Addressing this untapped potential, we explore the concept of turning a flexible office chair into a ubiquitous input device. To facilitate daily desktop work, we propose the utilization of semaphoric chair gestures that can be assigned to specific application functionalities. The exploration of two usage scenarios in the context of focused and peripheral interaction demonstrates high potential of chair gestures as additional input modality for opportunistic, hands-free interaction.","@type":"BlogPosting","headline":"A chair as ubiquitous input device: exploring semaphoric chair gestures for focused and peripheral interaction","dateModified":"2025-04-28T18:18:03-05:00","datePublished":"2025-04-28T18:18:03-05:00","mainEntityOfPage":{"@type":"WebPage","@id":"http://localhost:4000/publications/2014-chair.html"},"@context":"https://schema.org"}</script>
 <!-- End Jekyll SEO tag -->
 
 </head>

_site/publications/2014-tracs.html

Lines changed: 2 additions & 2 deletions
@@ -50,9 +50,9 @@
 <meta property="og:url" content="http://localhost:4000/publications/2014-tracs.html" />
 <meta property="og:site_name" content="CMU Augmented Perception Lab" />
 <meta property="og:type" content="article" />
-<meta property="article:published_time" content="2025-04-28T18:13:00-05:00" />
+<meta property="article:published_time" content="2025-04-28T18:18:03-05:00" />
 <script type="application/ld+json">
-{"author":{"@type":"Person","name":"David Lindlbauer"},"url":"http://localhost:4000/publications/2014-tracs.html","description":"We present Tracs, a dual-sided see-through display system with controllable transparency. Traditional displays are a constant visual and communication barrier, hindering fast and efficient collaboration of spatially close or facing co-workers. Transparent displays could potentially remove these barriers, but introduce new issues of personal privacy, screen content privacy and visual interference. We therefore propose a solution with controllable transparency to overcome these problems. Tracs consists of two see-through displays, with a transparency-control layer, a backlight layer and a polarization adjustment layer in-between. The transparency-control layer is built as a grid of individually addressable transparency-controlled patches, allowing users to control the transparency overall or just locally. Additionally, the locally switchable backlight layer improves the contrast of LCD screen content. Tracs allows users to switch between personal and collaborative work fast and easily and gives them full control of transparent regions on their display.","@type":"BlogPosting","headline":"Tracs: transparency-control for see-through displays","dateModified":"2025-04-28T18:13:00-05:00","datePublished":"2025-04-28T18:13:00-05:00","mainEntityOfPage":{"@type":"WebPage","@id":"http://localhost:4000/publications/2014-tracs.html"},"@context":"https://schema.org"}</script>
+{"author":{"@type":"Person","name":"David Lindlbauer"},"url":"http://localhost:4000/publications/2014-tracs.html","description":"We present Tracs, a dual-sided see-through display system with controllable transparency. Traditional displays are a constant visual and communication barrier, hindering fast and efficient collaboration of spatially close or facing co-workers. Transparent displays could potentially remove these barriers, but introduce new issues of personal privacy, screen content privacy and visual interference. We therefore propose a solution with controllable transparency to overcome these problems. Tracs consists of two see-through displays, with a transparency-control layer, a backlight layer and a polarization adjustment layer in-between. The transparency-control layer is built as a grid of individually addressable transparency-controlled patches, allowing users to control the transparency overall or just locally. Additionally, the locally switchable backlight layer improves the contrast of LCD screen content. Tracs allows users to switch between personal and collaborative work fast and easily and gives them full control of transparent regions on their display.","@type":"BlogPosting","headline":"Tracs: transparency-control for see-through displays","dateModified":"2025-04-28T18:18:03-05:00","datePublished":"2025-04-28T18:18:03-05:00","mainEntityOfPage":{"@type":"WebPage","@id":"http://localhost:4000/publications/2014-tracs.html"},"@context":"https://schema.org"}</script>
 <!-- End Jekyll SEO tag -->
 
 </head>
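Note: the remaining _site/ changes (feed.xml and the three publication pages above) differ only in their timestamps, which track the local build time and therefore change on every rebuild; they are regenerated output of the jekyll-feed and jekyll-seo-tag plugins. A sketch of where that markup comes from, assuming the plugins' standard layout usage (the actual layout is not shown in this commit):

<head>
  <!-- Assumed layout snippet: standard jekyll-seo-tag / jekyll-feed usage -->
  {% seo %}        <!-- emits the og:* meta tags and the application/ld+json block seen in the hunks above -->
  {% feed_meta %}  <!-- emits the autodiscovery <link> for /feed.xml -->
</head>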

0 commit comments
