Human Generated Data

Title

Untitled (two women in costumes and bowties performing on stage)

Date

c. 1955

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9548

Machine Generated Data

Tags

Amazon
created on 2022-01-28

Person 99.2
Human 99.2
Person 99.1
Shorts 92.2
Clothing 92.2
Apparel 92.2
Stage 86.9
Pedestrian 75.5
People 69.6
Leisure Activities 68.4
Urban 60.7
Building 60.1
Female 60.1
Drawing 60
Art 60
Architecture 57.7
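
Labels like these are produced by Amazon Rekognition's label detection. A minimal sketch of how such a list could be regenerated with the boto3 SDK, assuming a local copy of the image (the file path and region are placeholders):

import boto3

# Placeholder region and file path; substitute your own.
client = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    response = client.detect_labels(Image={"Bytes": f.read()}, MinConfidence=50)

# Print each label with its confidence, matching the format of the list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")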

Imagga
created on 2022-01-28

runner 55.7
athlete 45.8
contestant 31.9
person 22
business 18.8
newspaper 18.6
global 17.3
world 17.2
art 15.9
people 15.6
man 15.4
grunge 15.3
design 15.2
sport 15.1
map 14.8
product 14.6
male 14.2
success 13.7
finance 13.5
silhouette 13.2
creation 13.2
drawing 12.8
chart 12.4
international 12.4
communication 11.7
financial 11.6
corporate 11.2
men 11.2
businessman 10.6
sketch 10.6
diagram 10.5
old 10.4
technology 10.4
symbol 10.1
black 9.7
graphic 9.5
colorful 9.3
modern 9.1
active 9
group 8.9
office 8.8
growth 8.8
wall 8.7
progress 8.7
graph 8.6
youth 8.5
money 8.5
presentation 8.4
network 8.3
globe 8.3
texture 8.3
vintage 8.3
team 8.1
market 8
portrait 7.8
web 7.6
hand 7.6
power 7.6
poster 7.5
daily 7.5
outdoors 7.5
glowing 7.4
artwork 7.3
digital 7.3
color 7.2
lifestyle 7.2
futuristic 7.2
idea 7.1
creative 7.1
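
Imagga exposes its auto-tagging as a REST endpoint. A hedged sketch using the requests library, assuming HTTP basic auth with an API key/secret pair (credentials and file path are placeholders):

import requests

# Placeholder credentials; Imagga authenticates with an API key/secret pair.
auth = ("<api_key>", "<api_secret>")

with open("photo.jpg", "rb") as f:
    resp = requests.post("https://api.imagga.com/v2/tags", auth=auth, files={"image": f})

# Assumed response shape: each entry carries an English tag and a confidence score.
for tag in resp.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")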

Google
created on 2022-01-28

Microsoft
created on 2022-01-28

text 99.3
person 93.7
clothing 85.7
footwear 74.1

Face analysis

AWS Rekognition

Age 26-36
Gender Male, 99.8%
Disgusted 50.5%
Happy 25.9%
Surprised 13.2%
Angry 3.9%
Calm 2.9%
Sad 2%
Fear 0.8%
Confused 0.7%

AWS Rekognition

Age 50-58
Gender Male, 99.8%
Happy 97.1%
Disgusted 1%
Calm 0.4%
Angry 0.4%
Surprised 0.3%
Confused 0.3%
Sad 0.2%
Fear 0.1%
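
Each AWS Rekognition block above corresponds to one detected face. A minimal sketch of the underlying call, again with a placeholder file path and region:

import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

# One FaceDetails entry per detected face: age range, gender, and emotion scores.
for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")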

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
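
The Google Vision results above are likelihood buckets rather than percentages, one block per detected face. A sketch using the google-cloud-vision client library (the file path is a placeholder):

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Likelihood fields are enum buckets (VERY_UNLIKELY through VERY_LIKELY).
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)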

Feature analysis

Amazon

Person 99.2%

Captions

Microsoft

a person standing in front of a building 68.9%
a person standing next to a building 60.7%
a person standing in front of a brick building 53.7%
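
Microsoft's tags and candidate captions come from the Azure Computer Vision service. A hedged sketch with the azure-cognitiveservices-vision-computervision SDK (the endpoint, key, and file path are placeholders):

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholder endpoint and key for an Azure Computer Vision resource.
client = ComputerVisionClient(
    "https://<resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("<subscription_key>"),
)

with open("photo.jpg", "rb") as f:
    result = client.describe_image_in_stream(f, max_candidates=3)

# Each candidate caption carries a confidence, as in the list above.
for caption in result.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")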

Text analysis

Amazon

KODAK--SAA-EITW

Google

-
XAGON
MJI7-- YT37A°2 - - XAGON
MJI7--
YT37A°2
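
The strings above are raw OCR output, returned verbatim by each service. A sketch of the two text-detection calls, with placeholder file path and region:

import boto3
from google.cloud import vision

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

# Amazon Rekognition text detection: print LINE-level results only.
rekognition = boto3.client("rekognition", region_name="us-east-1")
for det in rekognition.detect_text(Image={"Bytes": image_bytes})["TextDetections"]:
    if det["Type"] == "LINE":
        print(det["DetectedText"])

# Google Vision text detection: the first annotation is the full text block,
# followed by the individual detected fragments.
client = vision.ImageAnnotatorClient()
response = client.text_detection(image=vision.Image(content=image_bytes))
for annotation in response.text_annotations:
    print(annotation.description)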