Human Generated Data

Title

Untitled (man holding a tray of teacups)

Date

1940

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8458

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags (confidence scores, 0–100)

Amazon
created on 2022-01-15

Clothing 99.3
Apparel 99.3
Person 98.6
Human 98.6
Person 97.8
Person 88
Shelf 76.6
Coat 70.8
Overcoat 70.3
Sleeve 67.9
Person 67.7
Text 58.4
Furniture 58.2
Long Sleeve 57.1
Pants 56.8
Hair 55
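
These labels match the output shape of Amazon Rekognition's DetectLabels API, which returns label names with confidence percentages. A minimal sketch of how such tags could be reproduced with boto3 (the file name and region are placeholders, not part of the record):

    import boto3

    # Rekognition client; region is an assumption
    rekognition = boto3.client("rekognition", region_name="us-east-1")

    # Read the photograph; the file name is a placeholder
    with open("steinmetz_8458.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MinConfidence=55,  # the list above bottoms out near 55
    )

    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")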

Clarifai
created on 2023-10-26

people 99.7
monochrome 99
adult 98.7
two 97.2
wear 95.8
woman 95.7
man 95.6
group together 95.3
group 92.8
recreation 91.4
three 91.2
actress 90.1
four 89.8
one 87.8
indoors 86.6
facial expression 85.6
outfit 85.2
actor 84.9
music 84.1
furniture 81.4
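
Clarifai's general recognition model returns concepts with values between 0 and 1; scaled by 100 they give scores like those above. A hedged sketch against the v2 REST API (the personal access token, model ID, and image URL are placeholders):

    import requests

    # Personal access token and image URL are placeholders
    headers = {"Authorization": "Key YOUR_PAT"}
    payload = {
        "inputs": [
            {"data": {"image": {"url": "https://example.org/steinmetz_8458.jpg"}}}
        ]
    }

    resp = requests.post(
        "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
        headers=headers,
        json=payload,
    )

    # Each concept carries a name and a 0-1 confidence value
    for concept in resp.json()["outputs"][0]["data"]["concepts"]:
        print(f"{concept['name']} {concept['value'] * 100:.1f}")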

Imagga
created on 2022-01-15

shop 39.4
mercantile establishment 25.5
fashion 21.8
person 19
black 18.1
adult 17.5
place of business 17
tobacco shop 16.8
people 15.6
city 15
urban 14.8
man 14.1
portrait 13.6
old 13.2
human 12.7
shoe shop 12.6
interior 12.4
sexy 12
locker 11.1
grunge 11.1
shopping 11
window 10.9
clothing 10.8
vintage 10.7
posing 10.7
male 10.6
one 10.4
business 10.3
device 10.3
women 10.3
inside 10.1
model 10.1
attractive 9.8
style 9.6
life 9.5
lifestyle 9.4
buy 9.4
newspaper 9.1
art 9.1
lady 8.9
indoors 8.8
product 8.7
establishment 8.4
modern 8.4
house 8.3
sale 8.3
fastener 8.1
looking 8
face 7.8
architecture 7.8
casual 7.6
statue 7.6
building 7.5
dark 7.5
indoor 7.3
girls 7.3
sensuality 7.3
music 7.2
body 7.2
home 7.2
hair 7.1
smile 7.1
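
Imagga's tagging endpoint returns English tags with confidence scores on a 0–100 scale, matching the list above. A sketch, assuming placeholder API credentials and image URL:

    import requests

    # API key/secret and image URL are placeholders
    auth = ("YOUR_API_KEY", "YOUR_API_SECRET")
    params = {"image_url": "https://example.org/steinmetz_8458.jpg"}

    resp = requests.get("https://api.imagga.com/v2/tags", auth=auth, params=params)

    # Each entry pairs a confidence score with a localized tag name
    for entry in resp.json()["result"]["tags"]:
        print(f"{entry['tag']['en']} {entry['confidence']:.1f}")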

Microsoft
created on 2022-01-15

text 99.9
person 96.4
black and white 94.2
clothing 88.5
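
These tags match the image-tagging output of Azure Computer Vision, which reports confidences between 0 and 1. A minimal sketch with the azure-cognitiveservices-vision-computervision SDK (endpoint, key, and image URL are placeholders):

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    # Endpoint, key, and image URL are placeholders
    client = ComputerVisionClient(
        "https://YOUR_RESOURCE.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("YOUR_KEY"),
    )

    result = client.tag_image("https://example.org/steinmetz_8458.jpg")
    for tag in result.tags:
        print(f"{tag.name} {tag.confidence * 100:.1f}")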

Face analysis

AWS Rekognition

Age 40-48
Gender Male, 92.6%
Calm 99.7%
Surprised 0.2%
Sad 0%
Angry 0%
Disgusted 0%
Confused 0%
Happy 0%
Fear 0%

AWS Rekognition

Age 30-40
Gender Male, 99%
Calm 70.1%
Sad 12.8%
Happy 7%
Confused 3.9%
Surprised 2.7%
Disgusted 2.1%
Angry 1.1%
Fear 0.4%

AWS Rekognition

Age 29-39
Gender Male, 99.1%
Calm 80.3%
Happy 15.4%
Confused 1.4%
Sad 1.1%
Surprised 0.6%
Angry 0.4%
Disgusted 0.4%
Fear 0.4%
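
Each block above corresponds to one face found by Amazon Rekognition's DetectFaces API, which reports an estimated age range, a gender guess with confidence, and per-emotion confidences. A sketch (the file name is a placeholder):

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("steinmetz_8458.jpg", "rb") as f:
        image_bytes = f.read()

    # Attributes=['ALL'] is required to get age, gender, and emotions
    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes}, Attributes=["ALL"]
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
        # Emotions sorted by confidence, as in the record above
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")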

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
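
Google Cloud Vision's face detection reports each attribute as a likelihood bucket (VERY_UNLIKELY through VERY_LIKELY) rather than a percentage, which is why the two faces above read "Very unlikely"/"Unlikely". A sketch with the google-cloud-vision client (the file name is a placeholder):

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("steinmetz_8458.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    for face in response.face_annotations:
        # Likelihood enums print as names like VERY_UNLIKELY
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)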

Feature analysis

Amazon

Person 98.6%

Text analysis

Amazon

14497
14497.
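
The strings above match the output of Amazon Rekognition's DetectText API, which returns both LINE and WORD detections; that overlap explains near-duplicate entries such as "14497" and "14497.". A sketch (the file name is a placeholder):

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("steinmetz_8458.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_text(Image={"Bytes": image_bytes})

    # Each detection carries the text, its type (LINE or WORD), and a confidence
    for detection in response["TextDetections"]:
        print(detection["DetectedText"], detection["Type"])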