Human Generated Data

Title

Untitled (man seated on floor with two women)

Date

c. 1950

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10534

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 98.8
Human 98.8
Chair 98.5
Furniture 98.5
Clothing 97
Apparel 97
Person 96.1
Face 87.5
Shoe 87.2
Footwear 87.2
Person 81.2
Female 76.1
Portrait 65.2
Photography 65.2
Photo 65.2
People 64.5
Woman 57.9
Hair 57.3
Heel 57.2
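
Tags like these are the output of Amazon Rekognition's DetectLabels API. A minimal sketch of such a call via boto3, assuming AWS credentials are configured; the filename is hypothetical:

```python
import boto3

# Sketch: label detection with Amazon Rekognition.
# Assumes configured AWS credentials; "photo.jpg" is a hypothetical filename.
client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=20,
        MinConfidence=50,
    )

# Print each label with its confidence, mirroring the "Person 98.8" style above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```

The "Feature analysis" percentages further down come from the same response: each label can also carry bounding-box Instances for objects such as Person and Shoe.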

Clarifai
created on 2023-10-25

people 99.9
group 98.5
adult 97.7
woman 97.7
monochrome 97.5
man 97.4
group together 95.8
music 92.6
facial expression 90.9
musician 90.3
sitting 89.4
administration 89.1
wear 88.5
indoors 88.5
sit 87.7
recreation 87.5
singer 87.5
furniture 87.1
child 85.2
portrait 85.1
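
Concept lists like this one come from a prediction against one of Clarifai's general recognition models. A sketch against Clarifai's v2 REST API, following its long-documented request shape; the API key, model ID, and image URL are placeholders, and the exact routing can differ by account setup:

```python
import requests

# Sketch: concept prediction via Clarifai's v2 REST API.
# The key, model ID, and image URL below are placeholders.
response = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": "Key YOUR_API_KEY"},
    json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
)
response.raise_for_status()

# Concepts carry a 0-1 value; scaled here to match "people 99.9" above.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```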

Imagga
created on 2022-01-09

television 58.6
telecommunication system 41.1
person 37.8
people 28.4
sport 23.8
athlete 22.5
silhouette 22.3
player 22.3
training 21.2
man 20.8
event 20.3
stadium 19.5
adult 19.3
male 19.1
patient 18.4
skill 18.3
crowd 18.2
muscular 18.1
competition 17.4
field 16.7
lights 16.7
world 16.7
cheering 16.7
audience 16.6
championship 16.5
flag 16.5
match 16.4
patriotic 16.3
case 16.1
nation 16.1
nighttime 15.7
symbol 15.5
men 14.6
sick person 14.4
black 13.8
icon 13.5
monitor 13.2
park 13.2
room 12.8
versus 12.8
equipment 12.7
happy 12.5
portrait 12.3
shorts 11.7
fight 11.6
sitting 11.2
youth 11.1
model 10.9
lifestyle 10.8
one 10.5
women 10.3
teacher 10.1
relax 10.1
fun 9.7
studio 9.1
business 9.1
pretty 9.1
fashion 9
team 9
lady 8.9
electronic equipment 8.9
punch 8.8
gloves 8.7
happiness 8.6
dangerous 8.6
smile 8.5
newspaper 8.5
attractive 8.4
against 8.3
human 8.2
freedom 8.2
exercise 8.2
style 8.2
group 8.1
ball 8
businessman 7.9
uppercut 7.9
jab 7.9
ropes 7.9
bright 7.9
boxing 7.8
kick 7.8
block 7.6
ring 7.5
enjoy 7.5
dark 7.5
glowing 7.4
professional 7.4
looking 7.2
body 7.2
classroom 7.2
shiny 7.1
indoors 7
modern 7
vibrant 7
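
Imagga tags are served by its REST tagging endpoint with HTTP Basic auth. A sketch against the documented /v2/tags endpoint; the key, secret, and image URL are placeholders, and the response shape follows Imagga's public documentation:

```python
import requests

# Sketch: tagging via Imagga's v2 REST endpoint.
# API key/secret and the image URL are placeholders.
response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=("YOUR_API_KEY", "YOUR_API_SECRET"),
)
response.raise_for_status()

# Confidences are already on a 0-100 scale, e.g. "television 58.6".
for item in response.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')
```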

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 98.2
musical instrument 93.4
black and white 93.3
clothing 93.2
person 91.8
guitar 91.6
man 88.2
concert 57.1
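
The Microsoft tags correspond to Azure Computer Vision's tagging operation. A sketch using the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and image URL are placeholders:

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Sketch: image tagging with Azure Computer Vision.
# Endpoint, key, and image URL are placeholders.
client = ComputerVisionClient(
    "https://YOUR_RESOURCE.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_KEY"),
)

result = client.tag_image("https://example.com/photo.jpg")

# Azure reports confidence on a 0-1 scale; scaled here to match "text 98.2".
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```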

Face analysis

AWS Rekognition

Age 28-38
Gender Male, 63.6%
Calm 81.5%
Sad 14.5%
Happy 1.8%
Surprised 0.9%
Disgusted 0.4%
Fear 0.3%
Angry 0.3%
Confused 0.2%

AWS Rekognition

Age 47-53
Gender Male, 94.5%
Happy 64.8%
Sad 19.7%
Surprised 11%
Fear 2.3%
Confused 0.6%
Calm 0.6%
Disgusted 0.5%
Angry 0.4%

AWS Rekognition

Age 49-57
Gender Male, 65.5%
Happy 82%
Surprised 6.5%
Sad 5%
Calm 4%
Disgusted 0.8%
Fear 0.7%
Confused 0.6%
Angry 0.4%
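
Per-face blocks like the three above are what Rekognition's DetectFaces API returns when full attributes are requested. A minimal sketch via boto3, assuming configured credentials; the filename is hypothetical:

```python
import boto3

# Sketch: face analysis with Amazon Rekognition.
# Assumes configured AWS credentials; "photo.jpg" is a hypothetical filename.
client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions come back unsorted; sort descending to mirror the listings above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```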

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
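
The Google Vision blocks report per-face likelihood ratings rather than percentages. A sketch using the google-cloud-vision client, assuming application default credentials; the filename is hypothetical:

```python
from google.cloud import vision

# Sketch: face detection with Google Cloud Vision.
# Assumes application default credentials; "photo.jpg" is hypothetical.
client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each attribute is a Likelihood enum (VERY_UNLIKELY ... VERY_LIKELY),
# matching the "Very unlikely" / "Likely" wording above.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```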

Feature analysis

Amazon

Person 98.8%
Shoe 87.2%

Text analysis

Amazon

20056
95002
20056.
rst
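
Fragments like these are LINE detections from Rekognition's DetectText API, which is noisy on photographs. A minimal sketch via boto3, assuming configured credentials; the filename is hypothetical:

```python
import boto3

# Sketch: text detection with Amazon Rekognition.
# Assumes configured AWS credentials; "photo.jpg" is a hypothetical filename.
client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_text(Image={"Bytes": f.read()})

# LINE entries correspond to fragments like "20056" above;
# WORD entries are the individual tokens within each line.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])
```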

Google

20
bnant
O056.
200S6.
20 056 bnant O056. 200S6.
056
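
Google's fragments come from the Vision API's text detection (OCR) feature, which likewise produces noisy word-level annotations on photographs. A sketch, assuming application default credentials; the filename is hypothetical:

```python
from google.cloud import vision

# Sketch: OCR with Google Cloud Vision; "photo.jpg" is hypothetical.
client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# The first annotation is the full detected text block; the rest are
# individual words, like the fragments listed above.
for annotation in response.text_annotations:
    print(annotation.description)
```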