Human Generated Data

Title

"Packaged House" System, 1942-1952: Construction

Date

c. 1943

People

Artist: William F. Karsten, American

Architect: Konrad Wachsmann, German, 1901-1980

Architect: Walter Gropius, German, 1883-1969

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of Walter Gropius, BRGA.97.171

Machine Generated Data

Tags

Amazon
created on 2022-05-28

Human 99.6
Person 99.6
Person 99.2
Person 99
Person 97.8
Shoe 97.4
Footwear 97.4
Clothing 97.4
Apparel 97.4
Wood 96.2
Handrail 95.5
Banister 95.5
Shoe 92.2
Carpenter 91
Railing 82
Construction 72.1
Plywood 59.9
Brick 59.3
Shorts 57.4
Worker 55.6
Flooring 55.3

Imagga
created on 2022-05-28

barrier 70.7
obstruction 52.6
structure 38.5
people 25.7
man 21.5
adult 20.7
male 20.6
sky 19.8
lifestyle 19.5
outdoors 18.7
outside 18
person 16.9
outdoor 15.3
sport 15.3
fun 14.2
leisure 14.1
silhouette 14.1
railing 13.9
worker 13.4
city 13.3
action 13
active 12.8
building 12.8
happy 12.5
balcony 12.3
travel 12
park 11.9
women 11.9
recreation 11.7
portrait 11.6
activity 11.6
one 11.2
summer 10.9
sunset 10.8
jumping 10.6
business 10.3
motion 10.3
day 10.2
architecture 10.2
water 10
pretty 9.8
boy 9.6
work 9.5
construction 9.4
equipment 9.1
human 9
lady 8.9
board 8.9
urban 8.7
standing 8.7
sea 8.6
athlete 8.5
skate 8.4
beach 8.4
ocean 8.4
attractive 8.4
teen 8.3
teenager 8.2
healthy 8.2
sun 8.2
fitness 8.1
river 8
smiling 8
cute 7.9
sand 7.9
couple 7.8
happiness 7.8
skateboard 7.7
youth 7.7
casual 7.6
professional 7.6
hobby 7.6
relax 7.6
fashion 7.5
holding 7.4
color 7.2
sexy 7.2
hair 7.1
smile 7.1
modern 7
together 7

Google
created on 2022-05-28

Microsoft
created on 2022-05-28

outdoor 95.8
man 91.7
black and white 87.7
person 82.1
clothing 58.1
wood 26.6

Face analysis

AWS Rekognition

Age 47-53
Gender Male, 100%
Disgusted 46.9%
Angry 28.8%
Sad 17.9%
Surprised 7.5%
Fear 6.5%
Calm 3.1%
Confused 1%
Happy 0.3%

AWS Rekognition

Age 50-58
Gender Male, 99.5%
Calm 92.2%
Surprised 8.1%
Fear 5.9%
Disgusted 2.8%
Sad 2.3%
Confused 0.5%
Angry 0.4%
Happy 0.2%

AWS Rekognition

Age 50-58
Gender Male, 99.5%
Calm 84.9%
Surprised 7%
Angry 6.2%
Fear 6.1%
Sad 2.9%
Happy 2.3%
Confused 1.3%
Disgusted 1.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%
Shoe 97.4%

Captions

Microsoft

a man standing on top of a wooden ramp 90.6%
a man doing a trick on a skateboard 46.9%
a man doing a trick on a rail 46.8%