Human Generated Data

Title

Untitled (waitress taking the order of three women in a booth at a diner)

Date

1941

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4889

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.6
Human 99.6
Person 99.6
Person 99.6
Person 99.5
Person 99.3
Interior Design 91.1
Indoors 91.1
Art 81.6
Painting 64
Clothing 59.6
Apparel 59.6
Crowd 55.8
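
The label/confidence pairs above match the output shape of Amazon Rekognition's DetectLabels operation. A minimal boto3 sketch of how similar tags could be produced follows; the local file name and the confidence threshold are placeholder assumptions, not details recorded on this page.

import boto3

client = boto3.client("rekognition")

# "photo.jpg" is a placeholder for a local copy of the image.
with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

# MinConfidence=55 is an assumption; the lowest score listed above is 55.8.
response = client.detect_labels(Image={"Bytes": image_bytes}, MinConfidence=55.0)

# Print "Name Confidence" pairs in the same style as the tag list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")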

Clarifai
created on 2023-10-26

people 99.7
group 97.6
group together 96.2
woman 95.3
man 94.8
recreation 94.5
adult 94.1
war 93.5
monochrome 89.7
child 89.5
music 86.4
sit 83.9
guitar 82.1
musician 81.1
enjoyment 80.9
many 80.2
vehicle 78.6
boy 78.4
several 78.2
sitting 76.8
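
Clarifai concepts like these are what its v2 predict endpoint returns for an image: a concept name plus a 0-1 confidence value. A hedged sketch against the raw REST API follows; the API key, the image URL, and the use of the public "general-image-recognition" model ID are all assumptions.

import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"       # placeholder
MODEL_ID = "general-image-recognition"  # assumed public general model
IMAGE_URL = "https://example.com/photo.jpg"  # placeholder

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
response.raise_for_status()

# Concept values are 0-1; scale to match the percentages shown above.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")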

Imagga
created on 2022-01-23

sax 59.3
stage 41.1
musical instrument 37.7
wind instrument 35.6
person 31.9
platform 26.2
accordion 23.8
people 23.4
man 22.8
male 22
keyboard instrument 19.2
adult 18
player 17.3
music 17.1
concert 16.5
musician 16.4
silhouette 15.7
musical 15.3
symbol 14.8
guitar 13.9
black 13.8
bass 13.8
group 13.7
rock 13
lights 13
event 12.9
classroom 12.5
crowd 12.5
businessman 12.4
business 12.1
cheering 11.8
audience 11.7
patriotic 11.5
performance 11.5
teacher 11.4
modern 11.2
flag 11
portrait 11
blackboard 10.8
nighttime 10.8
stadium 10.7
design 10.7
happy 10.7
vibrant 10.5
nation 10.4
men 10.3
icon 10.3
youth 10.2
school 9.9
band 9.7
class 9.6
computer 9.6
brass 9.6
education 9.5
training 9.2
glowing 9.2
star 9.2
dark 9.2
competition 9.1
studio 9.1
laptop 9.1
art 9.1
board 9
sky 8.9
room 8.9
job 8.8
skill 8.7
muscular 8.6
bright 8.6
singer 8.4
field 8.4
sport 8.2
performer 8.2
style 8.2
working 8
shiny 7.9
serve 7.8
play 7.8
match 7.7
professional 7.7
power 7.6
human 7.5
electric 7.5
executive 7.5
instrument 7.4
holding 7.4
light 7.4
lady 7.3
playing 7.3
protection 7.3
student 7.2
lifestyle 7.2
day 7.1
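
Imagga serves tags like these from its /v2/tags REST endpoint, which returns a confidence score and an English tag name per entry. A minimal sketch, assuming placeholder credentials and image URL:

import requests

API_KEY = "YOUR_IMAGGA_KEY"        # placeholder
API_SECRET = "YOUR_IMAGGA_SECRET"  # placeholder
IMAGE_URL = "https://example.com/photo.jpg"  # placeholder

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),  # HTTP basic auth with key/secret
)
response.raise_for_status()

# Each entry carries a confidence and a localized tag name.
for tag in response.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")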

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 98.2
clothing 88.1
person 85.4
black and white 75.4
man 66
old 43.1
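
The Microsoft tags have the shape of the Azure Computer Vision Tag operation (a tag name plus a 0-1 confidence). A sketch against the v3.2 REST endpoint follows; the resource endpoint, key, and image URL are placeholders.

import requests

ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_AZURE_KEY"                       # placeholder
IMAGE_URL = "https://example.com/photo.jpg"  # placeholder

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/tag",
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": IMAGE_URL},
)
response.raise_for_status()

# Confidences are 0-1; scale to match the percentages shown above.
for tag in response.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")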

Face analysis

AWS Rekognition (Face 1)

Age 40-48
Gender Male, 99.6%
Sad 77.6%
Happy 8.8%
Calm 6.5%
Surprised 3.3%
Confused 1.5%
Fear 1.3%
Disgusted 0.6%
Angry 0.4%

AWS Rekognition (Face 2)

Age 37-45
Gender Male, 92.3%
Happy 66.9%
Surprised 24.2%
Calm 2.6%
Fear 2.6%
Sad 1.4%
Disgusted 1%
Confused 0.8%
Angry 0.5%

AWS Rekognition (Face 3)

Age 26-36
Gender Male, 99.9%
Surprised 53.4%
Calm 27.2%
Happy 9.3%
Fear 6.6%
Disgusted 1.8%
Angry 0.8%
Confused 0.5%
Sad 0.4%
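
Each AWS Rekognition block above (an age range, a gender call, and emotion percentages) corresponds to one FaceDetails entry from the DetectFaces operation. A minimal boto3 sketch, assuming a placeholder local file:

import boto3

client = boto3.client("rekognition")

# "photo.jpg" is a placeholder for a local copy of the image.
with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

# Attributes=["ALL"] requests age, gender, and emotion estimates per face.
response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for i, face in enumerate(response["FaceDetails"], start=1):
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Face {i}: Age {age['Low']}-{age['High']}, "
          f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"  {emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")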

Google Vision (Face 1)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision (Face 2)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision (Face 3)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision (Face 4)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
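
The Google Vision blocks report per-face likelihood enums (VERY_UNLIKELY through VERY_LIKELY) as returned by the Cloud Vision face-detection feature. A sketch with the google-cloud-vision client, again assuming a placeholder file name:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

# "photo.jpg" is a placeholder; credentials come from the environment.
with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# One annotation per detected face, mirroring the four blocks above.
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)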

Feature analysis

Amazon

Person 99.6%

Text analysis

Amazon

H16007.
H 16007.

Google

HI6007. HI6007.
HI6007.
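
The two OCR engines read the same mark slightly differently ("H16007." versus "HI6007."). A sketch comparing both text-detection calls, assuming a placeholder local file:

import boto3
from google.cloud import vision

with open("photo.jpg", "rb") as f:  # placeholder file name
    image_bytes = f.read()

# Amazon Rekognition DetectText returns LINE and WORD detections.
rekognition = boto3.client("rekognition")
result = rekognition.detect_text(Image={"Bytes": image_bytes})
for det in result["TextDetections"]:
    if det["Type"] == "LINE":
        print("Amazon:", det["DetectedText"])

# Google Cloud Vision: the first annotation holds the full detected text.
client = vision.ImageAnnotatorClient()
response = client.text_detection(image=vision.Image(content=image_bytes))
if response.text_annotations:
    print("Google:", response.text_annotations[0].description)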