Human Generated Data

Title

Untitled (couple dancing in brewing company while surrounded by people)

Date

1948

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9200

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.8
Human 99.8
Apparel 99.5
Clothing 99.5
Person 99
Person 98.6
Person 98.4
Person 92
Person 91.7
Person 86.8
Coat 79.9
Sleeve 74.9
Suit 71.1
Overcoat 71.1
Pants 68.4
Leisure Activities 66.3
Long Sleeve 66
Building 65
Face 65
Photography 65
Portrait 65
Photo 65
Stage 63.7
Dance Pose 57.4
Scientist 55.6
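Tag lists like the one above are typically consumed by filtering on the confidence score. A minimal sketch (the 70% threshold is an arbitrary illustration, and only a subset of the record's labels is transcribed):

```python
# A subset of the Amazon Rekognition-style labels from the record above,
# as (label, confidence-percent) pairs.
labels = [
    ("Person", 99.8), ("Apparel", 99.5), ("Clothing", 99.5),
    ("Coat", 79.9), ("Suit", 71.1), ("Dance Pose", 57.4),
    ("Scientist", 55.6),
]

# Keep only labels at or above an (illustrative) 70% confidence threshold.
confident = [name for name, score in labels if score >= 70.0]
print(confident)
```

Low-confidence entries such as "Scientist" (55.6) are exactly the kind of tag such a cutoff is meant to drop.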

Imagga
created on 2022-01-23

man 36.9
people 29
business 24.3
male 23.5
cricket equipment 21.1
businessman 20.3
adult 19.3
sports equipment 18.9
person 18.8
wicket 17.4
office 16.6
men 15.4
smiling 15.2
happy 15
city 15
corporate 14.6
outdoors 14.2
sport 13.7
equipment 13.4
group 12.9
work 12.6
building 12.5
together 12.3
worker 11.8
businesswoman 11.8
team 11.6
lifestyle 11.6
working 11.5
smile 11.4
walking 11.4
couple 11.3
boy 11.3
house 10.9
meeting 10.4
portrait 10.3
professional 10.3
women 10.3
teamwork 10.2
day 10.2
two 10.2
street 10.1
world 10
travel 9.9
family 9.8
40s 9.7
urban 9.6
home 9.6
standing 9.6
architecture 9.5
sitting 9.4
back 9.2
room 9
black 9
cricket bat 9
success 8.8
job 8.8
looking 8.8
active 8.6
happiness 8.6
communication 8.4
executive 8.3
children 8.2
love 7.9
education 7.8
30s 7.7
employee 7.6
wife 7.6
leisure 7.5
holding 7.4
ball 7.4
vacation 7.4
child 7.3
crutch 7.3
suit 7.2
percussion instrument 7.1
indoors 7

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

window 95.3
clothing 90.5
person 90.4
text 83.8
black and white 78.9
monochrome 52.3

Face analysis

AWS Rekognition

Age 34-42
Gender Male, 98.3%
Happy 96.1%
Sad 1.9%
Confused 0.8%
Calm 0.6%
Surprised 0.2%
Angry 0.2%
Disgusted 0.1%
Fear 0.1%

AWS Rekognition

Age 22-30
Gender Male, 59.5%
Calm 95.7%
Confused 1.6%
Disgusted 1.2%
Surprised 0.5%
Happy 0.5%
Sad 0.3%
Angry 0.2%
Fear 0.1%

AWS Rekognition

Age 50-58
Gender Male, 99.6%
Sad 46.3%
Happy 32.9%
Calm 9.2%
Fear 3.6%
Confused 3.3%
Disgusted 2%
Surprised 1.5%
Angry 1.2%

AWS Rekognition

Age 40-48
Gender Female, 72.7%
Happy 48.7%
Calm 43%
Sad 3.4%
Surprised 2.2%
Angry 0.9%
Confused 0.7%
Disgusted 0.6%
Fear 0.5%
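Each face record above reports a full emotion distribution; the headline emotion is just the highest-confidence entry. A minimal sketch using the third face's values as transcribed from the record:

```python
# Emotion -> confidence (percent) for the third AWS Rekognition face above.
face = {
    "Sad": 46.3, "Happy": 32.9, "Calm": 9.2, "Fear": 3.6,
    "Confused": 3.3, "Disgusted": 2.0, "Surprised": 1.5, "Angry": 1.2,
}

# The dominant emotion is the key with the maximum confidence.
dominant = max(face, key=face.get)
print(dominant)  # Sad
```

Note how close "Sad" (46.3) and "Happy" (32.9) are for this face: a margin check between the top two scores, rather than a bare argmax, would flag such ambiguous reads.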

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.8%

Captions

Microsoft

a group of people standing in front of a window 91.4%
a man standing in front of a window 89.2%
a group of people in front of a window 89.1%

Text analysis

Amazon

13551
KODVK--EVEETX

Google

st 3 S S J KODVK-
S
st
3
KODVK-
J