Human Generated Data

Title

Untitled (man inspecting couple's luggage at customs counter, Miami International Airport)

Date

1951, printed later

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12226

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Person 99.5
Human 99.5
Person 96.8
Person 93.6
Person 88.9
Apparel 85.8
Clothing 85.8
Furniture 80
Sitting 75.3
Face 61.7
Wood 58.6
Flooring 56.3
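Each machine-generated tag above pairs a label with the model's confidence score on a 0–100 scale. A minimal sketch of parsing and ranking such label/score lines, using a few values copied from the Amazon list (the `parse_tags` helper is made up for illustration, not part of any museum or vendor API):

```python
# Sample tag lines copied from the Amazon (Rekognition) list above.
amazon_tags = """\
Person 99.5
Human 99.5
Apparel 85.8
Furniture 80
Sitting 75.3
Wood 58.6"""

def parse_tags(text):
    """Parse 'label score' lines into (label, confidence) pairs,
    sorted by confidence, highest first. Labels may contain spaces
    (e.g. Clarifai's 'group together'), so split on the last space."""
    pairs = []
    for line in text.splitlines():
        label, score = line.rsplit(" ", 1)
        pairs.append((label, float(score)))
    return sorted(pairs, key=lambda p: p[1], reverse=True)

tags = parse_tags(amazon_tags)
print(tags[0])  # → ('Person', 99.5)
```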

Clarifai
created on 2019-11-16

people 99.8
group 97.9
adult 97.3
group together 96.3
room 95.1
many 93.8
man 93.4
furniture 93
two 91
music 90.8
one 90.4
several 89.5
woman 88.6
monochrome 88.3
wear 86.9
leader 86.8
movie 86.5
administration 85.5
three 83
musician 82.8

Imagga
created on 2019-11-16

people 31.8
man 31.5
television 30.6
person 25.9
male 22
telecommunication system 21.5
adult 21
lifestyle 18
happy 17.5
indoors 16.7
group 16.1
indoor 14.6
business 14.6
room 14.5
office 14.2
men 13.7
computer 13.2
smiling 13
leisure 12.4
working 12.4
friends 12.2
cheerful 12.2
smile 12.1
portrait 11.6
handsome 11.6
interior 11.5
classroom 11.4
professional 11.2
fun 11.2
sitting 11.2
love 11
work 11
happiness 11
music 10.9
chair 10.9
attractive 10.5
technology 10.4
equipment 10.3
women 10.3
casual 10.2
worker 9.9
pretty 9.8
black 9.6
party 9.4
laptop 9.3
model 9.3
entertainment 9.2
teenager 9.1
lady 8.9
night 8.9
together 8.7
couple 8.7
clothing 8.6
meeting 8.5
fashion 8.3
businesswoman 8.2
restaurant 8.2
style 8.1
team 8.1
job 8
businessman 7.9
education 7.8
modern 7.7
center 7.7
youth 7.7
studio 7.6
communication 7.5
company 7.4
table 7.2
blond 7.2
home 7.2

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

person 95.6
clothing 93.2
black and white 73.2
cartoon 72.6
text 71
man 69.3
human face 58.7

Face analysis

AWS Rekognition

Age 48-66
Gender Male, 54.6%
Angry 45.1%
Happy 45.1%
Disgusted 45%
Surprised 45%
Fear 45%
Sad 45.3%
Confused 45.1%
Calm 54.5%

AWS Rekognition

Age 30-46
Gender Male, 54.6%
Disgusted 45.1%
Confused 45.1%
Fear 45.1%
Calm 50%
Happy 48.3%
Surprised 45.9%
Angry 45.3%
Sad 45.2%

AWS Rekognition

Age 26-40
Gender Female, 52.1%
Fear 45.1%
Happy 45.4%
Confused 45.1%
Calm 49.3%
Disgusted 45%
Sad 49.2%
Angry 45.8%
Surprised 45.1%

AWS Rekognition

Age 31-47
Gender Male, 52.6%
Happy 45.1%
Angry 45.8%
Disgusted 45.1%
Confused 45.2%
Sad 46.1%
Calm 51.9%
Surprised 45.6%
Fear 45.3%

AWS Rekognition

Age 6-16
Gender Female, 50.6%
Sad 48.3%
Disgusted 48.7%
Happy 45.4%
Confused 45.2%
Fear 45.1%
Angry 45.5%
Calm 46.7%
Surprised 45%

AWS Rekognition

Age 16-28
Gender Female, 52%
Confused 45.1%
Surprised 45.1%
Happy 45%
Calm 46.7%
Disgusted 45%
Sad 49.8%
Angry 48%
Fear 45.2%
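Each AWS Rekognition face record above carries a confidence score per emotion; the face's apparent emotion is simply the highest-scoring entry (Calm at 54.5% for the first face). A minimal sketch of picking it out, with scores copied from the first record (the `dominant_emotion` helper is hypothetical, not a Rekognition API call):

```python
# Emotion scores for the first detected face, copied from the
# AWS Rekognition record above (percentages, 0-100).
face_1 = {
    "Angry": 45.1, "Happy": 45.1, "Disgusted": 45.0, "Surprised": 45.0,
    "Fear": 45.0, "Sad": 45.3, "Confused": 45.1, "Calm": 54.5,
}

def dominant_emotion(scores):
    """Return the (emotion, score) pair with the highest confidence."""
    return max(scores.items(), key=lambda kv: kv[1])

print(dominant_emotion(face_1))  # → ('Calm', 54.5)
```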

Microsoft Cognitive Services

Age 55
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%

Captions

Microsoft

a group of people in a room 90.1%
a group of people standing in front of a television 69.7%
a group of people standing in a room 69.6%

Text analysis

Google

37