
Commit 179e11d: update people
Parent: e29ea52


53 files changed: +185 / -191 lines

Gemfile.lock (3 additions, 9 deletions)

```diff
@@ -222,17 +222,9 @@ GEM
     minitest (5.25.1)
     net-http (0.4.1)
       uri
-    nokogiri (1.18.8)
+    nokogiri (1.17.2)
       mini_portile2 (~> 2.8.2)
       racc (~> 1.4)
-    nokogiri (1.18.8-arm64-darwin)
-      racc (~> 1.4)
-    nokogiri (1.18.8-x64-mingw-ucrt)
-      racc (~> 1.4)
-    nokogiri (1.18.8-x86_64-darwin)
-      racc (~> 1.4)
-    nokogiri (1.18.8-x86_64-linux-gnu)
-      racc (~> 1.4)
     octokit (4.25.1)
       faraday (>= 1, < 3)
       sawyer (~> 0.9)
@@ -267,6 +259,7 @@ GEM
     thread_safe (~> 0.1)
     unicode-display_width (1.8.0)
     uri (0.13.1)
+    wdm (0.2.0)
     webrick (1.8.2)
     zeitwerk (2.6.18)
 
@@ -282,6 +275,7 @@ DEPENDENCIES
   github-pages (~> 208)
   kramdown (>= 2.3.0)
   listen (~> 3.2)
+  wdm (>= 0.1.0)
   webrick (~> 1.8)
 
 BUNDLED WITH
```
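The lockfile above resolves nokogiri down to 1.17.2 (dropping the 1.18.8 platform-specific builds) and adds the `wdm` file-watching gem for Windows. As a hedged sketch only: the repository's actual Gemfile is not part of this diff, but Gemfile lines consistent with the DEPENDENCIES block would look roughly like this (the `platforms:` guard on `wdm` is an assumption, since wdm is Windows-only):

```ruby
# Gemfile (illustrative sketch; the real Gemfile is not shown in this commit)
source "https://rubygems.org"

gem "github-pages", "~> 208"
gem "kramdown", ">= 2.3.0"
gem "listen", "~> 3.2"
gem "webrick", "~> 1.8"

# File-watching support for `jekyll serve` on Windows; matches the
# new `wdm (>= 0.1.0)` entry in DEPENDENCIES. Restricting it to
# Windows platforms is an assumption, not visible in the diff.
gem "wdm", ">= 0.1.0", platforms: [:mingw, :x64_mingw, :mswin]
```

Note that nokogiri does not appear in DEPENDENCIES, so its downgrade comes from dependency resolution (re-running `bundle install`/`bundle lock`) rather than an explicit Gemfile pin.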

_data/people.yml (1 addition, 1 deletion)

```diff
@@ -41,7 +41,7 @@
 
 - name: Aashna Kulshrestha
   website: https://www.linkedin.com/in/aashna-kulshrestha/
-  image: /assets/person.png
+  image: /assets/people/aashna-kulshrestha.jpg
   role: Undergraduate student
 
 - name: Jose Lima
```
Binary image file (544 KB; content not rendered).
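For reference, entries in `_data/people.yml` parse into plain mappings that Jekyll exposes as `site.data.people`. A minimal sketch confirming the structure of the edited entry, using PyYAML (Jekyll itself uses Ruby's Psych; this is purely illustrative and inlines just the one entry from the diff):

```python
import yaml  # PyYAML (third-party); used here only to illustrate the data shape

entry_yaml = """
- name: Aashna Kulshrestha
  website: https://www.linkedin.com/in/aashna-kulshrestha/
  image: /assets/people/aashna-kulshrestha.jpg
  role: Undergraduate student
"""

# safe_load returns a list of dicts, one per person entry.
people = yaml.safe_load(entry_yaml)
print(people[0]["image"])  # the path this commit points at the new photo
```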

_site/feed.xml (1 addition, 1 deletion)

```diff
@@ -1 +1 @@
-<?xml version="1.0" encoding="utf-8"?><feed xmlns="http://www.w3.org/2005/Atom" ><generator uri="https://jekyllrb.com/" version="3.9.0">Jekyll</generator><link href="http://localhost:4000/feed.xml" rel="self" type="application/atom+xml" /><link href="http://localhost:4000/" rel="alternate" type="text/html" /><updated>2025-06-09T16:28:11-04:00</updated><id>http://localhost:4000/feed.xml</id><title type="html">CMU Augmented Perception Lab</title><subtitle>Augmented Perception Lab at Carnegie Mellon University in Pittsburgh.</subtitle></feed>
+<?xml version="1.0" encoding="utf-8"?><feed xmlns="http://www.w3.org/2005/Atom" ><generator uri="https://jekyllrb.com/" version="3.9.0">Jekyll</generator><link href="http://localhost:4000/feed.xml" rel="self" type="application/atom+xml" /><link href="http://localhost:4000/" rel="alternate" type="text/html" /><updated>2025-06-09T16:46:50-04:00</updated><id>http://localhost:4000/feed.xml</id><title type="html">CMU Augmented Perception Lab</title><subtitle>Augmented Perception Lab at Carnegie Mellon University in Pittsburgh.</subtitle></feed>
```

_site/publications.html (44 additions, 44 deletions; large diff not rendered by default)

_site/publications/2013-suggero.html (2 additions, 2 deletions)

```diff
@@ -50,9 +50,9 @@
 <meta property="og:url" content="http://localhost:4000/publications/2013-suggero.html" />
 <meta property="og:site_name" content="CMU Augmented Perception Lab" />
 <meta property="og:type" content="article" />
-<meta property="article:published_time" content="2025-06-09T16:28:11-04:00" />
+<meta property="article:published_time" content="2025-06-09T16:46:50-04:00" />
 <script type="application/ld+json">
-{"headline":"Perceptual grouping: selection assistance for digital sketching","dateModified":"2025-06-09T16:28:11-04:00","datePublished":"2025-06-09T16:28:11-04:00","mainEntityOfPage":{"@type":"WebPage","@id":"http://localhost:4000/publications/2013-suggero.html"},"author":{"@type":"Person","name":"David Lindlbauer"},"@type":"BlogPosting","url":"http://localhost:4000/publications/2013-suggero.html","description":"Modifying a digital sketch may require multiple selections before a particular editing tool can be applied. Especially on large interactive surfaces, such interactions can be fatiguing. Accordingly, we propose a method, called Suggero, to facilitate the selection process of digital ink. Suggero identifies groups of perceptually related drawing objects. These “perceptual groups” are used to suggest possible extensions in response to a person’s initial selection. Two studies were conducted. First, a background study investigated participant’s expectations of such a selection assistance tool. Then, an empirical study compared the effectiveness of Suggero with an existing manual technique. The results revealed that Suggero required fewer pen interactions and less pen movement, suggesting that Suggero minimizes fatigue during digital sketching.","@context":"https://schema.org"}</script>
+{"headline":"Perceptual grouping: selection assistance for digital sketching","dateModified":"2025-06-09T16:46:50-04:00","datePublished":"2025-06-09T16:46:50-04:00","@type":"BlogPosting","url":"http://localhost:4000/publications/2013-suggero.html","author":{"@type":"Person","name":"David Lindlbauer"},"description":"Modifying a digital sketch may require multiple selections before a particular editing tool can be applied. Especially on large interactive surfaces, such interactions can be fatiguing. Accordingly, we propose a method, called Suggero, to facilitate the selection process of digital ink. Suggero identifies groups of perceptually related drawing objects. These “perceptual groups” are used to suggest possible extensions in response to a person’s initial selection. Two studies were conducted. First, a background study investigated participant’s expectations of such a selection assistance tool. Then, an empirical study compared the effectiveness of Suggero with an existing manual technique. The results revealed that Suggero required fewer pen interactions and less pen movement, suggesting that Suggero minimizes fatigue during digital sketching.","mainEntityOfPage":{"@type":"WebPage","@id":"http://localhost:4000/publications/2013-suggero.html"},"@context":"https://schema.org"}</script>
 <!-- End Jekyll SEO tag -->
 
 </head>
```

_site/publications/2014-chair.html (2 additions, 2 deletions)

```diff
@@ -50,9 +50,9 @@
 <meta property="og:url" content="http://localhost:4000/publications/2014-chair.html" />
 <meta property="og:site_name" content="CMU Augmented Perception Lab" />
 <meta property="og:type" content="article" />
-<meta property="article:published_time" content="2025-06-09T16:28:11-04:00" />
+<meta property="article:published_time" content="2025-06-09T16:46:50-04:00" />
 <script type="application/ld+json">
-{"headline":"A chair as ubiquitous input device: exploring semaphoric chair gestures for focused and peripheral interaction","dateModified":"2025-06-09T16:28:11-04:00","datePublished":"2025-06-09T16:28:11-04:00","mainEntityOfPage":{"@type":"WebPage","@id":"http://localhost:4000/publications/2014-chair.html"},"author":{"@type":"Person","name":"Kathrin Probst"},"@type":"BlogPosting","url":"http://localhost:4000/publications/2014-chair.html","description":"During everyday office work we are used to controlling our computers with keyboard and mouse, while the majority of our body remains unchallenged and the physical workspace around us stays largely unattended. Addressing this untapped potential, we explore the concept of turning a flexible office chair into a ubiquitous input device. To facilitate daily desktop work, we propose the utilization of semaphoric chair gestures that can be assigned to specific application functionalities. The exploration of two usage scenarios in the context of focused and peripheral interaction demonstrates high potential of chair gestures as additional input modality for opportunistic, hands-free interaction.","@context":"https://schema.org"}</script>
+{"headline":"A chair as ubiquitous input device: exploring semaphoric chair gestures for focused and peripheral interaction","dateModified":"2025-06-09T16:46:50-04:00","datePublished":"2025-06-09T16:46:50-04:00","@type":"BlogPosting","url":"http://localhost:4000/publications/2014-chair.html","author":{"@type":"Person","name":"Kathrin Probst"},"description":"During everyday office work we are used to controlling our computers with keyboard and mouse, while the majority of our body remains unchallenged and the physical workspace around us stays largely unattended. Addressing this untapped potential, we explore the concept of turning a flexible office chair into a ubiquitous input device. To facilitate daily desktop work, we propose the utilization of semaphoric chair gestures that can be assigned to specific application functionalities. The exploration of two usage scenarios in the context of focused and peripheral interaction demonstrates high potential of chair gestures as additional input modality for opportunistic, hands-free interaction.","mainEntityOfPage":{"@type":"WebPage","@id":"http://localhost:4000/publications/2014-chair.html"},"@context":"https://schema.org"}</script>
 <!-- End Jekyll SEO tag -->
 
 </head>
```

_site/publications/2014-tracs.html (2 additions, 2 deletions)

```diff
@@ -50,9 +50,9 @@
 <meta property="og:url" content="http://localhost:4000/publications/2014-tracs.html" />
 <meta property="og:site_name" content="CMU Augmented Perception Lab" />
 <meta property="og:type" content="article" />
-<meta property="article:published_time" content="2025-06-09T16:28:11-04:00" />
+<meta property="article:published_time" content="2025-06-09T16:46:50-04:00" />
 <script type="application/ld+json">
-{"headline":"Tracs: transparency-control for see-through displays","dateModified":"2025-06-09T16:28:11-04:00","datePublished":"2025-06-09T16:28:11-04:00","mainEntityOfPage":{"@type":"WebPage","@id":"http://localhost:4000/publications/2014-tracs.html"},"author":{"@type":"Person","name":"David Lindlbauer"},"@type":"BlogPosting","url":"http://localhost:4000/publications/2014-tracs.html","description":"We present Tracs, a dual-sided see-through display system with controllable transparency. Traditional displays are a constant visual and communication barrier, hindering fast and efficient collaboration of spatially close or facing co-workers. Transparent displays could potentially remove these barriers, but introduce new issues of personal privacy, screen content privacy and visual interference. We therefore propose a solution with controllable transparency to overcome these problems. Tracs consists of two see-through displays, with a transparency-control layer, a backlight layer and a polarization adjustment layer in-between. The transparency-control layer is built as a grid of individually addressable transparency-controlled patches, allowing users to control the transparency overall or just locally. Additionally, the locally switchable backlight layer improves the contrast of LCD screen content. Tracs allows users to switch between personal and collaborative work fast and easily and gives them full control of transparent regions on their display.","@context":"https://schema.org"}</script>
+{"headline":"Tracs: transparency-control for see-through displays","dateModified":"2025-06-09T16:46:50-04:00","datePublished":"2025-06-09T16:46:50-04:00","@type":"BlogPosting","url":"http://localhost:4000/publications/2014-tracs.html","author":{"@type":"Person","name":"David Lindlbauer"},"description":"We present Tracs, a dual-sided see-through display system with controllable transparency. Traditional displays are a constant visual and communication barrier, hindering fast and efficient collaboration of spatially close or facing co-workers. Transparent displays could potentially remove these barriers, but introduce new issues of personal privacy, screen content privacy and visual interference. We therefore propose a solution with controllable transparency to overcome these problems. Tracs consists of two see-through displays, with a transparency-control layer, a backlight layer and a polarization adjustment layer in-between. The transparency-control layer is built as a grid of individually addressable transparency-controlled patches, allowing users to control the transparency overall or just locally. Additionally, the locally switchable backlight layer improves the contrast of LCD screen content. Tracs allows users to switch between personal and collaborative work fast and easily and gives them full control of transparent regions on their display.","mainEntityOfPage":{"@type":"WebPage","@id":"http://localhost:4000/publications/2014-tracs.html"},"@context":"https://schema.org"}</script>
 <!-- End Jekyll SEO tag -->
 
 </head>
```

_site/publications/2015-creature-teacher.html (2 additions, 2 deletions)

```diff
@@ -50,9 +50,9 @@
 <meta property="og:url" content="http://localhost:4000/publications/2015-creature-teacher.html" />
 <meta property="og:site_name" content="CMU Augmented Perception Lab" />
 <meta property="og:type" content="article" />
-<meta property="article:published_time" content="2025-06-09T16:28:11-04:00" />
+<meta property="article:published_time" content="2025-06-09T16:46:50-04:00" />
 <script type="application/ld+json">
-{"headline":"Creature Teacher: A Performance-Based Animation System for Creating Cyclic Movements","dateModified":"2025-06-09T16:28:11-04:00","datePublished":"2025-06-09T16:28:11-04:00","mainEntityOfPage":{"@type":"WebPage","@id":"http://localhost:4000/publications/2015-creature-teacher.html"},"author":{"@type":"Person","name":"Andreas Fender"},"@type":"BlogPosting","url":"http://localhost:4000/publications/2015-creature-teacher.html","description":"We present Creature Teacher, a performance-based animation system for creating cyclic movements. Users directly manipulate body parts of a virtual character by using their hands. Creature Teacher’s generic approach makes it possible to animate rigged 3D models with nearly arbitrary topology (e.g., non-humanoid) without requiring specialized user-to-character mappings or predefined movements. We use a bimanual interaction paradigm, allowing users to select parts of the model with one hand and manipulate them with the other hand. Cyclic movements of body parts during manipulation are detected and repeatedly played back - also while animating other body parts. Our approach of taking cyclic movements as an input makes mode switching between recording and playback obsolete and allows for fast and seamless creation of animations. We show that novice users with no animation background were able to create expressive cyclic animations for initially static virtual 3D creatures.","@context":"https://schema.org"}</script>
+{"headline":"Creature Teacher: A Performance-Based Animation System for Creating Cyclic Movements","dateModified":"2025-06-09T16:46:50-04:00","datePublished":"2025-06-09T16:46:50-04:00","@type":"BlogPosting","url":"http://localhost:4000/publications/2015-creature-teacher.html","author":{"@type":"Person","name":"Andreas Fender"},"description":"We present Creature Teacher, a performance-based animation system for creating cyclic movements. Users directly manipulate body parts of a virtual character by using their hands. Creature Teacher’s generic approach makes it possible to animate rigged 3D models with nearly arbitrary topology (e.g., non-humanoid) without requiring specialized user-to-character mappings or predefined movements. We use a bimanual interaction paradigm, allowing users to select parts of the model with one hand and manipulate them with the other hand. Cyclic movements of body parts during manipulation are detected and repeatedly played back - also while animating other body parts. Our approach of taking cyclic movements as an input makes mode switching between recording and playback obsolete and allows for fast and seamless creation of animations. We show that novice users with no animation background were able to create expressive cyclic animations for initially static virtual 3D creatures.","mainEntityOfPage":{"@type":"WebPage","@id":"http://localhost:4000/publications/2015-creature-teacher.html"},"@context":"https://schema.org"}</script>
 <!-- End Jekyll SEO tag -->
 
 </head>
```

_site/publications/2015-geltouch.html (2 additions, 2 deletions)

```diff
@@ -50,9 +50,9 @@
 <meta property="og:url" content="http://localhost:4000/publications/2015-geltouch.html" />
 <meta property="og:site_name" content="CMU Augmented Perception Lab" />
 <meta property="og:type" content="article" />
-<meta property="article:published_time" content="2025-06-09T16:28:11-04:00" />
+<meta property="article:published_time" content="2025-06-09T16:46:50-04:00" />
 <script type="application/ld+json">
-{"headline":"GelTouch: Localized Tactile Feedback Through Thin, Programmable Gel","dateModified":"2025-06-09T16:28:11-04:00","datePublished":"2025-06-09T16:28:11-04:00","mainEntityOfPage":{"@type":"WebPage","@id":"http://localhost:4000/publications/2015-geltouch.html"},"author":{"@type":"Person","name":"Viktor Miruchna"},"@type":"BlogPosting","url":"http://localhost:4000/publications/2015-geltouch.html","description":"We present GelTouch, a gel-based layer that can selectively transition between soft and stiff to provide tactile multi-touch feedback. It is flexible, transparent when not activated, and contains no mechanical, electromagnetic, or hydraulic components, resulting in a compact form factor (a 2mm thin touchscreen layer for our prototype). The activated areas can be morphed freely and continuously, without being limited to fixed, predefined shapes. GelTouch consists of a poly(N-isopropylacrylamide) gel layer which alters its viscoelasticity when activated by applying heat (&gt;32 C). We present three different activation techniques: 1) Indium Tin Oxide (ITO) as a heating element that enables tactile feedback through individually addressable taxels; 2) predefined tactile areas of engraved ITO, that can be layered and combined; 3) complex arrangements of resistance wire that create thin tactile edges. We present a tablet with 6x4 tactile areas, enabling a tactile numpad, slider, and thumbstick. We show that the gel is up to 25 times stiffer when activated and that users detect tactile features reliably (94.8%).","@context":"https://schema.org"}</script>
+{"headline":"GelTouch: Localized Tactile Feedback Through Thin, Programmable Gel","dateModified":"2025-06-09T16:46:50-04:00","datePublished":"2025-06-09T16:46:50-04:00","@type":"BlogPosting","url":"http://localhost:4000/publications/2015-geltouch.html","author":{"@type":"Person","name":"Viktor Miruchna"},"description":"We present GelTouch, a gel-based layer that can selectively transition between soft and stiff to provide tactile multi-touch feedback. It is flexible, transparent when not activated, and contains no mechanical, electromagnetic, or hydraulic components, resulting in a compact form factor (a 2mm thin touchscreen layer for our prototype). The activated areas can be morphed freely and continuously, without being limited to fixed, predefined shapes. GelTouch consists of a poly(N-isopropylacrylamide) gel layer which alters its viscoelasticity when activated by applying heat (&gt;32 C). We present three different activation techniques: 1) Indium Tin Oxide (ITO) as a heating element that enables tactile feedback through individually addressable taxels; 2) predefined tactile areas of engraved ITO, that can be layered and combined; 3) complex arrangements of resistance wire that create thin tactile edges. We present a tablet with 6x4 tactile areas, enabling a tactile numpad, slider, and thumbstick. We show that the gel is up to 25 times stiffer when activated and that users detect tactile features reliably (94.8%).","mainEntityOfPage":{"@type":"WebPage","@id":"http://localhost:4000/publications/2015-geltouch.html"},"@context":"https://schema.org"}</script>
 <!-- End Jekyll SEO tag -->
 
 </head>
```
