Human Generated Data

Title

Untitled (man taking picture of posing family)

Date

1952

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10656

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Human 99.3
Person 99.3
Person 98.9
Apparel 98.9
Clothing 98.9
Person 95.1
Shorts 87
Shoe 85
Footwear 85
Person 84.4
Face 83.3
People 79.5
Shoe 79.1
Female 78.8
Sleeve 75.5
Furniture 74.8
Long Sleeve 71.6
Girl 63.8
Coat 62.8
Suit 62.8
Overcoat 62.8
Pants 61.6
Woman 60.8
Text 60.7
Flooring 60.6
Floor 59.4
Monitor 57.3
Display 57.3
Screen 57.3
Electronics 57.3
Indoors 57.1
Child 56.8
Kid 56.8
Play 56.5
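The Amazon tags above are (label, confidence) pairs of the kind returned by an image-labeling service such as AWS Rekognition. A minimal sketch of filtering such output by a confidence threshold — the label list is copied from the data above, and the function name is illustrative, not part of any API:

```python
# Filter machine-generated image labels by confidence score.
# Each entry mirrors a (tag, score) pair from the list above;
# scores are percentages.
labels = [
    ("Human", 99.3), ("Person", 99.3), ("Apparel", 98.9),
    ("Clothing", 98.9), ("Shorts", 87.0), ("Shoe", 85.0),
    ("Footwear", 85.0), ("Face", 83.3), ("Girl", 63.8),
    ("Play", 56.5),
]

def confident_labels(labels, threshold=80.0):
    """Return tag names whose confidence meets or exceeds the threshold."""
    return [tag for tag, score in labels if score >= threshold]

print(confident_labels(labels))
# ['Human', 'Person', 'Apparel', 'Clothing', 'Shorts', 'Shoe', 'Footwear', 'Face']
```

Raising the threshold narrows the list to only the strongest detections, e.g. `confident_labels(labels, threshold=99.0)` keeps just "Human" and "Person".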

Imagga
created on 2022-01-15

man 39
people 28.4
room 27.2
person 27.1
male 26.9
office 25.6
adult 25.6
business 25.5
teacher 24
businessman 23.8
professional 23.1
classroom 20.4
sitting 18
men 18
computer 16.9
laptop 16.8
executive 16
indoors 15.8
corporate 15.5
working 15
educator 14.9
smiling 14.5
job 14.1
newspaper 14.1
world 13.6
women 13.4
modern 12.6
happy 12.5
lifestyle 12.3
meeting 12.2
businesswoman 11.8
communication 11.8
portrait 11.6
interior 11.5
black 11.4
couple 11.3
group 11.3
table 11.1
work 11
suit 10.8
product 10.8
businesspeople 10.4
manager 10.2
home 9.6
boss 9.6
career 9.5
desk 9.4
teamwork 9.3
phone 9.2
alone 9.1
indoor 9.1
building 9
technology 8.9
conference 8.8
youth 8.5
casual 8.5
two 8.5
creation 8.4
back 8.3
team 8.1
success 8
smile 7.8
chair 7.8
face 7.8
life 7.8
case 7.7
pretty 7.7
old 7.7
talking 7.6
fashion 7.5
holding 7.4
style 7.4
occupation 7.3
board 7.3
looking 7.2
handsome 7.1
family 7.1
happiness 7
travel 7

Face analysis

AWS Rekognition

Age 28-38
Gender Male, 67.4%
Calm 99.8%
Sad 0.1%
Happy 0%
Surprised 0%
Confused 0%
Disgusted 0%
Fear 0%
Angry 0%

AWS Rekognition

Age 34-42
Gender Female, 78.3%
Calm 72.3%
Happy 22.7%
Surprised 1.5%
Sad 1.4%
Confused 1.2%
Disgusted 0.4%
Angry 0.3%
Fear 0.2%

AWS Rekognition

Age 11-19
Gender Female, 96.1%
Calm 78.2%
Happy 16.7%
Surprised 2.7%
Disgusted 0.7%
Fear 0.6%
Sad 0.5%
Angry 0.4%
Confused 0.3%

AWS Rekognition

Age 24-34
Gender Female, 88.6%
Calm 57.6%
Happy 38.9%
Surprised 1.7%
Sad 1.1%
Confused 0.2%
Disgusted 0.2%
Angry 0.1%
Fear 0.1%
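Each AWS Rekognition face block above reports an age range, a gender estimate, and a distribution of emotion scores; the face's reported emotion is simply the highest-scoring entry in that distribution. A small sketch of selecting the dominant emotion — the scores are copied from the first face above, and the function name is illustrative:

```python
# Pick the dominant emotion from a Rekognition-style emotion
# score distribution. Scores are percentages summing to ~100.
face_1 = {
    "Calm": 99.8, "Sad": 0.1, "Happy": 0.0, "Surprised": 0.0,
    "Confused": 0.0, "Disgusted": 0.0, "Fear": 0.0, "Angry": 0.0,
}

def dominant_emotion(scores):
    """Return the (emotion, score) pair with the highest confidence."""
    return max(scores.items(), key=lambda item: item[1])

print(dominant_emotion(face_1))  # ('Calm', 99.8)
```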

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.3%
Shoe 85%

Captions

Microsoft

a group of people in a room 84.2%

Text analysis

Amazon

35166
all
KODAK-SELA

Google

人 山 3 166
3
166