Human Generated Data

Title

Untitled (two women around table at reception under a tent)

Date

1941

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.11673

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Human 99.7
Person 99.7
Person 97.4
Clothing 86.7
Apparel 86.7
Meal 78.9
Food 78.9
Female 78.9
Woman 65.5
People 63.8
Text 63.7
Face 63.7
Plant 59.6
Girl 58.8
Spoke 58.5
Machine 58.5
Performer 55.1
Person 45.5

Clarifai
created on 2023-10-26

people 99.7
man 98
two 97.7
adult 97.7
woman 95.8
group 94.3
music 93.4
illustration 93.3
monochrome 90.7
recreation 90.4
wear 88.3
three 87.3
guitar 86.4
retro 85.4
musician 84.4
sitting 80.8
group together 79.8
art 79.4
child 79
enjoyment 75.8

Imagga
created on 2022-01-15

man 25.5
person 24.9
sitting 21.5
people 21.2
adult 19.1
model 17.9
male 17.8
portrait 16.8
fashion 16.6
lifestyle 15.9
smiling 15.2
happy 15
women 15
vehicle 14.2
work 13.3
wheeled vehicle 12.4
sexy 12
professional 11.7
outdoors 11.4
clothing 11.3
attractive 11.2
men 11.2
youth 11.1
worker 10.7
job 10.6
working 10.6
pretty 10.5
style 10.4
sport 10.4
smile 10
newspaper 9.8
lady 9.7
fun 9.7
business 9.7
one 9.7
indoors 9.7
couple 9.6
senior 9.4
casual 9.3
music 9
looking 8.8
motor scooter 8.6
relax 8.4
black 8.4
car 8.4
hand 8.4
human 8.2
technology 8.2
life 8.1
dress 8.1
equipment 8.1
success 8
computer 8
art 8
home 8
product 7.8
play 7.8
board 7.7
expression 7.7
skateboard 7.6
joy 7.5
laptop 7.3
sensuality 7.3
exercise 7.3
moped 7.2
body 7.2
team 7.2
hair 7.1
face 7.1
modern 7

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

text 98.8
outdoor 93.4
clothing 92.7
black and white 89
person 87.7
woman 81.7
human face 79.7

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 27-37
Gender Male, 70.9%
Calm 94.7%
Sad 1.8%
Confused 1.4%
Surprised 0.9%
Angry 0.3%
Happy 0.3%
Fear 0.3%
Disgusted 0.3%

AWS Rekognition

Age 41-49
Gender Male, 93.5%
Calm 64.4%
Happy 15.8%
Confused 9.1%
Fear 2.9%
Angry 2.6%
Sad 2.2%
Surprised 2%
Disgusted 1.1%

Feature analysis

Amazon

Person 99.7%

Categories

Captions

Text analysis

Amazon

19540.
17540.

Google

17540. 7540. HAGOX-YT3RA2-
17540.
7540.
HAGOX-YT3RA2-