Human Generated Data

Title

Untitled (man holding two bluebirds in yard with woman)

Date

c. 1950

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10704

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Person 99.2
Human 99.2
Person 98.2
Face 88.6
Clothing 87.9
Apparel 87.9
Dress 85
Female 80.1
Plant 75.5
Texture 73.2
Portrait 71.7
Photography 71.7
Photo 71.7
Tree 67.8
People 66
Outdoors 65.4
Grass 64.4
Leisure Activities 64.2
Woman 60.1
Sleeve 60
Girl 58.9
Art 56.6
Yard 56
Nature 56
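
The Amazon tags above are the kind of label output returned by Rekognition's DetectLabels API. A minimal sketch of how such a list might be produced, assuming configured boto3 credentials and a hypothetical local file name:

import boto3

# Minimal sketch: send image bytes to Amazon Rekognition and print
# label names with confidence scores, like the list above.
client = boto3.client("rekognition")

with open("steinmetz_10704.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,  # the lowest confidence shown above is 56
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")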

Clarifai
created on 2023-10-26

people 99.9
canine 99
adult 98.7
two 98.5
wear 98.3
woman 97.3
dog 96.7
man 95.3
group 95
three 94.6
leader 93.3
one 91.9
four 91.8
child 90.8
elderly 87.9
actress 87.8
outfit 85.6
group together 84.7
several 83.7
monochrome 82.6
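
Clarifai concepts like those above can be fetched from Clarifai's v2 predict endpoint. A hedged sketch, assuming a personal access token and the general image-recognition model (the record does not state which model produced these tags):

import requests

# Hedged sketch: POST an image URL to Clarifai's v2 predict endpoint and
# print each returned concept with its confidence. Model ID, token, and
# image URL are placeholders.
MODEL_ID = "general-image-recognition"  # assumed model
PAT = "YOUR_CLARIFAI_PAT"

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {PAT}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
)
resp.raise_for_status()

for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")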

Imagga
created on 2022-01-15

newspaper 19.3
person 17.8
adult 17.7
portrait 16.2
people 16.2
dress 15.3
child 14.8
product 14.8
outdoor 14.5
mother 13.8
clothing 13.3
man 12.8
couple 12.2
kin 12.1
musical instrument 11.6
creation 11.6
male 11.5
fashion 11.3
outdoors 11.2
accordion 11.1
summer 10.9
park 10.7
hair 10.3
women 10.3
parent 10.1
lifestyle 10.1
model 10.1
world 9.9
face 9.9
attractive 9.8
happy 9.4
smile 9.3
human 9
keyboard instrument 8.9
love 8.7
dog 8.7
two 8.5
power 8.4
old 8.4
danger 8.2
lady 8.1
day 7.8
happiness 7.8
tree 7.8
pretty 7.7
sky 7.6
dark 7.5
holding 7.4
water 7.3
wind instrument 7.3
looking 7.2
travel 7
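
Imagga exposes a comparable tagging endpoint. A rough sketch, with placeholder credentials and image URL:

import requests

# Hedged sketch: call Imagga's /v2/tags endpoint with HTTP basic auth and
# print tag names with confidence values, as in the list above.
AUTH = ("YOUR_IMAGGA_KEY", "YOUR_IMAGGA_SECRET")  # placeholders

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=AUTH,
)
resp.raise_for_status()

for tag in resp.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")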

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

text 98
outdoor 95.7
clothing 79.3
person 68.9
man 59.8
black and white 55.6
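
The Microsoft tags resemble output from Azure Computer Vision's analyze endpoint. A hedged sketch against the v3.2 REST API, with placeholder endpoint and key:

import requests

# Hedged sketch: request Tags from Azure Computer Vision v3.2 and print
# them as percentages. Endpoint, key, and image URL are placeholders.
ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
KEY = "YOUR_AZURE_KEY"

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.com/photo.jpg"},
)
resp.raise_for_status()

for tag in resp.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")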

Color Analysis

Face analysis

AWS Rekognition

Age 23-33
Gender Male, 56.3%
Surprised 62.9%
Calm 26.8%
Sad 3.8%
Happy 3.2%
Angry 1.3%
Disgusted 1.1%
Confused 0.6%
Fear 0.3%
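
The age range, gender estimate, and emotion percentages above match the shape of Rekognition's DetectFaces response when all attributes are requested. A minimal sketch:

import boto3

# Minimal sketch: request full face attributes from Amazon Rekognition and
# print age range, gender, and emotion confidences, as reported above.
client = boto3.client("rekognition")

with open("steinmetz_10704.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")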

Google Vision (identical ratings reported for each of four detected faces)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
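
Google Vision reports face attributes as likelihood buckets rather than percentages. A sketch using the google-cloud-vision client, assuming credentials are configured:

from google.cloud import vision

# Sketch: run Google Vision face detection and print the likelihood bucket
# (VERY_UNLIKELY ... VERY_LIKELY) for each attribute of each face.
client = vision.ImageAnnotatorClient()

with open("steinmetz_10704.jpg", "rb") as f:  # hypothetical file name
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)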

Feature analysis

Amazon

Person 99.2%

Categories

Text analysis

Amazon

35424
BE
NAGOY
VI37A3
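
The strings above are raw OCR hits; Rekognition's DetectText returns each detection with its text, a type (LINE or WORD), and a confidence. A minimal sketch:

import boto3

# Minimal sketch: run Amazon Rekognition text detection and print each
# detected line, like the raw strings above.
client = boto3.client("rekognition")

with open("steinmetz_10704.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

response = client.detect_text(Image={"Bytes": image_bytes})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])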

Google

カカSE
SE