Human Generated Data

Title

Untitled (couple in trailer home with piano, Sarasota, Florida)

Date

1954, printed later

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags (label and confidence score, 0–100)

Amazon
created on 2022-05-27

Person 99.8
Human 99.8
Person 99.5
Furniture 92.4
LCD Screen 88.5
Electronics 88.5
Monitor 88.5
Screen 88.5
Display 88.5
Pc 76.4
Computer 76.4
Plant 70
Chair 69.6
Sitting 68.4
Produce 59.2
Food 59.2
Shelf 58.4
Table 56.1
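
The Amazon tags above are label-detection output of the kind returned by AWS Rekognition. A minimal sketch of how such labels can be requested with the boto3 SDK is shown below, assuming the image is read from S3; the bucket name, object key, and thresholds are illustrative placeholders, not details taken from this record.

    # Minimal sketch: label detection with AWS Rekognition via boto3.
    # Bucket, key, and thresholds are placeholders, not values from this record.
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    response = rekognition.detect_labels(
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "steinmetz-trailer-piano.jpg"}},
        MaxLabels=25,
        MinConfidence=55,
    )

    # Each label has a name and a 0-100 confidence score, matching the
    # "label  score" pairs listed above (parent labels such as Person/Human
    # can appear alongside more specific ones).
    for label in response["Labels"]:
        print(label["Name"], round(label["Confidence"], 1))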

Clarifai
created on 2023-10-30

people 99.9
two 98.7
man 98.3
adult 98.1
woman 97.7
group 97.4
room 97.1
furniture 96.5
monochrome 95.7
four 92.5
three 91.7
group together 91.4
child 91.1
apple 89.6
portrait 87
one 86.2
son 83.6
indoors 83.5
several 82.9
offspring 82.8

Imagga
created on 2022-05-27

ball 28.9
musical instrument 21
adult 18.2
world 17.5
people 17.3
person 16.5
game equipment 16
equipment 15.7
male 15
happy 15
man 14.8
lifestyle 14.4
child 14.4
happiness 14.1
attractive 14
accordion 12.8
food 12.7
smiling 11.6
soccer ball 11.5
smile 11.4
holding 10.7
healthy 10.7
looking 10.4
keyboard instrument 10.3
seller 10.2
life 10.1
cheerful 9.7
couple 9.6
men 9.4
cute 9.3
wind instrument 9.3
pretty 9.1
human 9
brown 8.8
business 8.5
old 8.4
leisure 8.3
girls 8.2
percussion instrument 8.1
kitchen 8
family 8
home 8
device 8
nurse 7.9
grass 7.9
love 7.9
kin 7.9
outside 7.7
hand 7.6
eat 7.5
planet 7.5
fun 7.5
occupation 7.3
dirty 7.2
dress 7.2
fresh 7.2
hair 7.1
handsome 7.1
portrait 7.1
women 7.1
face 7.1

Google
created on 2022-05-27

Photograph 94.4
White 92.2
Black 89.7
Standing 86.4
Black-and-white 84.1
Style 84
Monochrome 78.2
Monochrome photography 75.6
Snapshot 74.3
Table 68.2
Food 66.2
Room 66.1
Vintage clothing 64.8
Cooking 64.3
Sitting 64.3
Stock photography 64.1
Kitchen 58.9
Tableware 58.2
Ball 56.5
House 55.6

Microsoft
created on 2022-05-27

indoor 98.5
window 97.3
floor 96.4
person 90.1
text 85.1
black and white 82.9
clothing 70.3
furniture 64.4

Face analysis

AWS Rekognition

Age 29-39
Gender Female, 99.9%
Happy 92%
Surprised 6.6%
Fear 6%
Sad 2.7%
Disgusted 2.4%
Angry 1.6%
Calm 1%
Confused 0.2%

AWS Rekognition

Age 53-61
Gender Male, 100%
Happy 99.3%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Calm 0.2%
Angry 0.1%
Disgusted 0.1%
Confused 0%
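
The two AWS Rekognition face readings above (age range, gender, and per-emotion confidences) are the kind of output returned when all facial attributes are requested from the face-detection API. A minimal boto3 sketch follows; the image reference is a placeholder. The emotion scores are reported independently per emotion, which is why the percentages above need not sum to 100.

    # Minimal sketch: face attributes (age range, gender, emotions)
    # from AWS Rekognition via boto3. Bucket and key are placeholders.
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    response = rekognition.detect_faces(
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "steinmetz-trailer-piano.jpg"}},
        Attributes=["ALL"],  # age range, gender, emotions, etc.
    )

    for face in response["FaceDetails"]:
        age, gender = face["AgeRange"], face["Gender"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        # Each emotion is scored on its own, so the values need not sum to 100.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')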

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely
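
The Google Vision readings above report qualitative likelihood buckets (Very unlikely through Very likely) per face rather than numeric scores. A minimal sketch with the google-cloud-vision Python client is given below, assuming the current v2-style client; the local file name is a placeholder.

    # Minimal sketch: face detection with the Google Cloud Vision client.
    # The local file name is a placeholder, not a reference from this record.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("steinmetz-trailer-piano.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    # Each face annotation carries likelihood buckets (VERY_UNLIKELY ... VERY_LIKELY)
    # for joy, sorrow, anger, surprise, headwear, and blur, as listed above.
    for face in response.face_annotations:
        print("Joy:", vision.Likelihood(face.joy_likelihood).name)
        print("Sorrow:", vision.Likelihood(face.sorrow_likelihood).name)
        print("Anger:", vision.Likelihood(face.anger_likelihood).name)
        print("Surprise:", vision.Likelihood(face.surprise_likelihood).name)
        print("Headwear:", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred:", vision.Likelihood(face.blurred_likelihood).name)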

Feature analysis

Amazon

Person 99.8%
Chair 69.6%
