Human Generated Data

Title

Untitled (couple toasting after getting married)

Date

1937

People

Artist: Joseph Janney Steinmetz, American 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.11653

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Person 99.7
Human 99.7
Person 90.5
Clothing 75.4
Apparel 75.4
Face 71.2
Food 69.9
Meal 69.9
Furniture 67.3
Dish 65.7
Restaurant 60.8
Couch 58.3
Sitting 58.3
Text 56.6
Monitor 55.3
Electronics 55.3
Screen 55.3
Display 55.3
Female 55.2
Person 53.4

Imagga
created on 2022-01-15

man 34.3
person 32.9
laptop 30.6
people 28.4
male 28.4
scholar 24.5
adult 24
computer 22.6
intellectual 19.5
business 19.4
musical instrument 18
portable computer 17.7
sitting 17.2
working 16.8
office 16.7
job 15.9
black 15.7
personal computer 15.6
notebook 15.6
room 15
chair 14.7
work 14.4
businessman 12.4
looking 12
home 12
women 11.9
professional 11.6
lifestyle 11.6
worker 11.6
men 11.2
newspaper 11.1
suit 11
light 10.7
indoors 10.5
couple 10.5
technology 10.4
wind instrument 10.2
silhouette 9.9
digital computer 9.9
executive 9.8
portrait 9.7
corporate 9.4
window 9.4
dark 9.2
studio 9.1
attractive 9.1
fashion 9
accordion 8.7
keyboard instrument 8.6
businesspeople 8.5
expression 8.5
senior 8.4
monitor 8.4
human 8.2
alone 8.2
keyboard 8.2
happy 8.1
music 8.1
casual 7.6
reading 7.6
relaxation 7.5
mature 7.4
occupation 7.3
relaxing 7.3
success 7.2
sexy 7.2
hair 7.1
love 7.1
interior 7.1
seat 7
modern 7

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

person 99.1
black and white 97.1
man 96.1
text 91.9
musical instrument 91.1
monochrome 87.6
guitar 83
black 66.2
concert 62.2
clothing 61.5

Face analysis

Amazon

AWS Rekognition

Age 37-45
Gender Male, 77.4%
Fear 46%
Sad 17%
Happy 11.2%
Confused 7%
Calm 6.3%
Angry 4.7%
Disgusted 4%
Surprised 3.9%

Feature analysis

Amazon

Person 99.7%

Captions

Microsoft

a man standing in front of a window 72.1%
a man sitting in front of a window 64.2%
a man standing in front of a laptop 64.1%

Text analysis

Amazon

830N3330
ESAB EVERY 830N3330
ESAB
EVERY

Google

32A9
YTAA2
32A9 YTAA2 830M3330
830M3330