Human Generated Data

Title

Untitled (street scene in Mörbisch, Austria)

Date

c. 1890 - c. 1915

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.4244

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Human 99.5
Person 99.5
Person 99.4
Walkway 99.3
Path 99.3
Person 99.1
City 98.2
Street 98.2
Road 98.2
Urban 98.2
Town 98.2
Building 98.2
Flagstone 98.2
Person 95.9
Person 94.8
Person 94.2
Sidewalk 91.8
Pavement 91.8
Nature 89.5
Person 88.8
Person 85.7
Person 74.2
Slate 70.2
Outdoors 66.8
Alleyway 66.5
Alley 66.5
Cobblestone 56.6
Person 54.1
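
The label list above matches the output shape of Amazon Rekognition's label-detection API: each entry is a label name with a confidence score. As a rough illustration only (this is not the museum's documented pipeline; the image filename and MinConfidence threshold are assumptions), tags like these can be reproduced with boto3:

```python
import boto3

rekognition = boto3.client("rekognition")

# Any local copy of the photograph; the filename is a placeholder.
with open("moerbisch_street_scene.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=50,  # assumed threshold; the list above bottoms out near 54%
    )

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```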

Imagga
created on 2022-01-29

swab 87.9
cleaning implement 71.2
travel 23.9
building 23.8
city 23.3
architecture 22
old 21.6
street 20.2
tourism 18.1
house 15.1
history 14.3
town 13.9
ancient 13.8
window 13
historical 12.2
wall 11.6
stone 11
urban 10.5
destination 10.3
culture 10.2
maypole 10
laundry 9.8
sky 9.6
buildings 9.4
meat hook 9.3
tree 9.2
road 9
landmark 9
outdoors 8.9
home 8.8
arch 8.7
antique 8.6
post 8.6
sea 8.6
people 8.4
world 8.3
historic 8.2
vacation 8.2
water 8
cobblestone 7.9
colorful 7.9
holiday 7.9
hook 7.9
scene 7.8
door 7.6
art 7.1
trees 7.1
summer 7.1
day 7.1
support 7
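
Imagga's tags come from a REST endpoint rather than an SDK. A minimal sketch of how a list like the one above could be requested, assuming the v2 /tags endpoint, placeholder credentials, and a hypothetical image URL (the response field names are reconstructed from memory and may differ from the current API):

```python
import requests

API_KEY = "<imagga-api-key>"        # placeholder credentials
API_SECRET = "<imagga-api-secret>"
IMAGE_URL = "https://example.org/moerbisch_street_scene.jpg"  # hypothetical URL

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
response.raise_for_status()

for entry in response.json()["result"]["tags"]:
    print(entry["tag"]["en"], round(entry["confidence"], 1))
```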

Google
created on 2022-01-29

Black 89.7
Sky 88.6
Black-and-white 86
Building 85.4
Style 84.1
Art 80.4
Adaptation 79.3
Monochrome photography 77.9
Tints and shades 77.3
Monochrome 76.4
Beauty 75.4
Road 75.1
Alley 75.1
House 73.8
City 72.2
Street 69.3
Room 68.5
Visual arts 66.5
History 65.8
Town 65.6
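
The Google tags above correspond to Cloud Vision label detection, which scores each label between 0 and 1. A minimal sketch using the google-cloud-vision client library (the image path is a placeholder, and this is not necessarily how these particular scores were produced):

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

# The filename is a placeholder for a local copy of the photograph.
with open("moerbisch_street_scene.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)
for label in response.label_annotations:
    # Scores are 0-1 floats; scale to match the percentages listed above.
    print(f"{label.description} {label.score * 100:.1f}")
```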

Microsoft
created on 2022-01-29

house 95.7
way 92.1
black and white 90
scene 83
sky 81.9
text 80
white 66.6
building 63.8
sidewalk 47.7
old 41
street 34.7
road 25.8

Face analysis

Amazon

AWS Rekognition

Age 21-29
Gender Male, 62.8%
Calm 43.2%
Sad 38.5%
Angry 10.5%
Happy 3.3%
Confused 2.2%
Fear 0.9%
Surprised 0.8%
Disgusted 0.7%

AWS Rekognition

Age 24-34
Gender Female, 81.7%
Calm 86.8%
Fear 5.3%
Happy 4.1%
Sad 1.8%
Angry 0.7%
Disgusted 0.6%
Surprised 0.4%
Confused 0.3%

AWS Rekognition

Age 0-6
Gender Male, 97.8%
Happy 34.4%
Calm 25.4%
Sad 20.2%
Angry 10.1%
Confused 4%
Disgusted 2.7%
Surprised 1.7%
Fear 1.4%

AWS Rekognition

Age 18-24
Gender Male, 94.7%
Confused 35.5%
Calm 30.6%
Happy 26.4%
Sad 2.3%
Surprised 1.9%
Disgusted 1.6%
Angry 1.5%
Fear 0.3%

AWS Rekognition

Age 18-26
Gender Female, 82.5%
Calm 99.7%
Angry 0.1%
Disgusted 0.1%
Sad 0%
Happy 0%
Surprised 0%
Confused 0%
Fear 0%
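
Each AWS Rekognition block above (age range, gender with confidence, and ranked emotion scores) mirrors one entry in the FaceDetails list returned by Rekognition face detection. A hedged sketch of how such per-face estimates are typically retrieved with boto3 (the image filename is a placeholder, not the museum's actual workflow):

```python
import boto3

rekognition = boto3.client("rekognition")

with open("moerbisch_street_scene.jpg", "rb") as f:  # placeholder filename
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age, gender, and emotion estimates
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    emotions = sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True)
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in emotions:
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```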

Feature analysis

Amazon

Person 99.5%

Captions

Microsoft

a group of people standing in front of a building 95%
a group of people that are standing in front of a building 91.9%
a group of people standing outside of a building 91.8%
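
Both the Microsoft tag list earlier in this record and the ranked captions above resemble the output of the Azure Computer Vision describe-image operation, which returns a flat tag list plus caption candidates with confidences. A minimal sketch assuming the azure-cognitiveservices-vision-computervision SDK, with placeholder endpoint, key, and filename:

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholder endpoint and key for a Computer Vision resource.
client = ComputerVisionClient(
    "https://<resource-name>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("<subscription-key>"),
)

with open("moerbisch_street_scene.jpg", "rb") as f:  # placeholder filename
    description = client.describe_image_in_stream(f, max_candidates=3)

print(description.tags)  # flat list of tag strings
for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")
```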