Human Generated Data

Title

Untitled (photographer taking picture of family posed in living room)

Date

c. 1950

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10655

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Person 99
Human 99
Person 98.5
Person 97
Accessory 95.8
Tie 95.8
Accessories 95.8
Shoe 92.5
Footwear 92.5
Clothing 92.5
Apparel 92.5
Shoe 86.7
Person 80.8
People 80.1
Chair 78.1
Furniture 78.1
Face 66.5
Text 60.8
Crowd 59.2
Handrail 59.1
Banister 59.1
Suit 56.9
Coat 56.9
Overcoat 56.9
Shorts 55.7
Musician 55.5
Musical Instrument 55.5
Female 55.3
Girl 55.3
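
The label/confidence pairs above are in the shape returned by Amazon Rekognition's detect_labels operation. A minimal sketch of how such tags could be regenerated with boto3 follows; the file name, region, and confidence threshold are assumptions, not part of this record.

```python
# Sketch: produce Rekognition-style label tags for a scanned photograph.
# Assumes AWS credentials are configured; file path and region are placeholders.
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("steinmetz_scan.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=55,  # the list above bottoms out around 55%
    )

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```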

Clarifai
created on 2023-10-26

people 99.9
group 97.4
adult 97.3
woman 96.9
man 93.5
child 93.1
group together 91.7
wear 90.9
monochrome 90.7
gown (clothing) 90.2
three 88.1
several 88
two 87.5
many 86.2
administration 86
kimono 85.9
actress 85.6
leader 83.4
four 83.3
education 82.8
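
These concept/confidence pairs match the output of Clarifai's general image-recognition model. The sketch below uses the v2 REST API with an app-scoped API key; the model ID, key, and image URL are placeholders, and the exact auth flow may differ for newer Clarifai accounts.

```python
# Sketch: request general-model concepts from the Clarifai v2 REST API.
# Model ID, API key, and image URL are assumptions, not values from this record.
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"
MODEL_ID = "general-image-recognition"  # assumed public general model ID
IMAGE_URL = "https://example.org/steinmetz_scan.jpg"

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
resp.raise_for_status()

# Each concept carries a 0-1 score; scale to percentages as listed above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```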

Imagga
created on 2022-01-15

newspaper 48.7
product 38.6
creation 29.3
man 26.2
people 23.4
person 21.5
adult 17.8
male 16.3
men 14.6
women 13.4
room 13.4
business 13.4
interior 13.3
indoors 12.3
lifestyle 12.3
office 12.2
businessman 11.5
travel 10.6
standing 10.4
home 10.4
portrait 10.3
happy 10
smile 10
lady 9.7
crutch 9.7
smiling 9.4
city 9.1
old 9
style 8.9
chair 8.9
job 8.8
life 8.6
happiness 8.6
two 8.5
black 8.4
health 8.3
fashion 8.3
back 8.3
occupation 8.2
work 8.2
transportation 8.1
working 7.9
professional 7.9
day 7.8
couple 7.8
walk 7.6
stick 7.5
house 7.5
staff 7.5
human 7.5
outdoors 7.5
holding 7.4
inside 7.4
group 7.2
color 7.2
holiday 7.2
world 7.1
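
Imagga exposes comparable tagging through its /v2/tags endpoint with HTTP basic auth. A minimal sketch, assuming a hosted copy of the image and placeholder credentials:

```python
# Sketch: fetch Imagga tags for a hosted image via the /v2/tags endpoint.
# API key/secret and the image URL are placeholders.
import requests

API_KEY = "YOUR_IMAGGA_API_KEY"
API_SECRET = "YOUR_IMAGGA_API_SECRET"
IMAGE_URL = "https://example.org/steinmetz_scan.jpg"

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
resp.raise_for_status()

# Tags come back with English names and 0-100 confidences.
for tag in resp.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")
```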

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

text 99.5
person 78.6
clothing 75.4
man 55.9
footwear 50.5
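
The Microsoft tags correspond to Azure Computer Vision's "tag" operation. A sketch against the REST endpoint follows; the resource endpoint, key, image URL, and API version (v3.2) are assumptions.

```python
# Sketch: tag an image with the Azure Computer Vision "tag" operation.
# Endpoint, key, and image URL are placeholders; API version assumed v3.2.
import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
KEY = "YOUR_AZURE_VISION_KEY"
IMAGE_URL = "https://example.org/steinmetz_scan.jpg"

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/tag",
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": IMAGE_URL},
)
resp.raise_for_status()

# Confidences are 0-1; scale to percentages as listed above.
for tag in resp.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```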

Color Analysis

Face analysis

AWS Rekognition

Age 35-43
Gender Male, 97.5%
Happy 58.4%
Calm 36.2%
Surprised 2.9%
Confused 0.7%
Sad 0.7%
Disgusted 0.4%
Angry 0.4%
Fear 0.3%

AWS Rekognition

Age 24-34
Gender Male, 70.6%
Calm 99.9%
Sad 0%
Surprised 0%
Confused 0%
Happy 0%
Disgusted 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 33-41
Gender Female, 99.6%
Calm 96.9%
Sad 1.3%
Surprised 1%
Angry 0.3%
Disgusted 0.2%
Happy 0.1%
Fear 0.1%
Confused 0.1%

AWS Rekognition

Age 7-17
Gender Male, 99.1%
Calm 97.6%
Sad 0.8%
Happy 0.6%
Angry 0.4%
Surprised 0.2%
Disgusted 0.2%
Confused 0.1%
Fear 0.1%

AWS Rekognition

Age 27-37
Gender Female, 93.8%
Calm 99.7%
Happy 0.2%
Sad 0%
Surprised 0%
Disgusted 0%
Angry 0%
Confused 0%
Fear 0%
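
The per-face age range, gender, and emotion estimates above are in the form returned by Rekognition's detect_faces operation with full attributes. A minimal sketch, with the file path and region as placeholders:

```python
# Sketch: per-face age/gender/emotion estimates via Rekognition detect_faces.
# Assumes AWS credentials are configured; file path and region are placeholders.
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("steinmetz_scan.jpg", "rb") as f:
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions are reported with confidences summing to roughly 100%.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")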

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely
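
Google Cloud Vision reports these face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages. A minimal sketch, with the local file path as a placeholder and credentials taken from the environment:

```python
# Sketch: Google Cloud Vision face detection returning likelihood buckets.
# File path is a placeholder; credentials come from the environment.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("steinmetz_scan.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```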

Feature analysis

Amazon

Person 99%
Tie 95.8%
Shoe 92.5%
Chair 78.1%

Text analysis

Amazon

35166-A

Google

35166-A 35166-A.
35166-A
35166-A.
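
Both services read the negative number written on the print via their OCR operations: Rekognition's detect_text and Google Cloud Vision's text_detection. A combined sketch follows; the file path, region, and credentials are placeholders.

```python
# Sketch: read the text strings above back from the scan with both services.
# File path, region, and credentials are placeholders.
import boto3
from google.cloud import vision

with open("steinmetz_scan.jpg", "rb") as f:
    image_bytes = f.read()

# Amazon Rekognition: word/line detections with confidences.
rek = boto3.client("rekognition", region_name="us-east-1")
for det in rek.detect_text(Image={"Bytes": image_bytes})["TextDetections"]:
    if det["Type"] == "LINE":
        print("Amazon:", det["DetectedText"])

# Google Cloud Vision: the first annotation is the full text, the rest are pieces.
gcv = vision.ImageAnnotatorClient()
response = gcv.text_detection(image=vision.Image(content=image_bytes))
for annotation in response.text_annotations:
    print("Google:", annotation.description)
```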