Human Generated Data

Title

Untitled (bar tenders at wedding reception)

Date

1941

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8596

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.7
Human 99.7
Person 99
Clothing 77.9
Apparel 77.9
Face 74.9
Alcohol 67.6
Beverage 67.6
Drink 67.6
Room 58.3
Indoors 58.3
Bottle 57.1
Overcoat 56.6
Coat 56.6
Text 56.6

Imagga
created on 2022-01-09

man 37.6
male 30.5
people 29
person 28.7
adult 22.9
business 19.4
black 19.3
men 17.2
musical instrument 16
office 15.5
fashion 15.1
portrait 14.2
barbershop 13.6
shop 13.4
handsome 13.4
businessman 13.2
bartender 13.2
indoors 13.2
lifestyle 13
casual 12.7
group 12.1
professional 11.6
suit 11.6
hand 11.4
one 11.2
attractive 11.2
corporate 10.3
silhouette 9.9
modern 9.8
room 9.7
looking 9.6
couple 9.6
home 9.6
happy 9.4
mercantile establishment 9.1
city 9.1
indoor 9.1
music 9.1
businesswoman 9.1
human 9
style 8.9
life 8.9
urban 8.7
world 8.7
hands 8.7
smiling 8.7
window 8.4
pretty 8.4
elegance 8.4
chair 8.2
playing 8.2
percussion instrument 8.1
computer 8.1
team 8.1
family 8
women 7.9
executive 7.9
smile 7.8
happiness 7.8
boy 7.8
face 7.8
device 7.7
career 7.6
clothing 7.5
meeting 7.5
fun 7.5
alone 7.3
worker 7.2
guitar 7.1
love 7.1
musician 7.1
job 7.1
working 7.1

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 98.3
person 96.6
clothing 95.2
black and white 90.4
man 85.1

Face analysis

Amazon

Google

AWS Rekognition

Age 54-64
Gender Male, 100%
Sad 82.7%
Calm 6.3%
Disgusted 4.5%
Confused 3.3%
Angry 1.2%
Fear 0.9%
Surprised 0.8%
Happy 0.3%

AWS Rekognition

Age 51-59
Gender Male, 99.8%
Happy 98.3%
Confused 0.8%
Surprised 0.4%
Sad 0.1%
Calm 0.1%
Disgusted 0.1%
Fear 0.1%
Angry 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%

Captions

Microsoft

a man standing next to a window 77.8%
a man standing in front of a window 77.7%
a man and a woman standing in front of a window 60.5%

Text analysis

Amazon

17676.
BM
.9L9L1
(FFCallum

Google

17676. Coa 17676.
17676.
Coa