Human Generated Data

Title

Untitled (two women and a man in a room with books)

Date

1942

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10543

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Room 99.4
Indoors 99.4
Person 97.9
Human 97.9
Person 97.4
Furniture 95.6
Clothing 94.8
Apparel 94.8
Library 90
Book 90
Female 85.6
Bookcase 82.6
Footwear 81.4
Shoe 81.4
Shop 78.5
Shelf 75.4
Woman 71.9
Sleeve 62.3
Interior Design 60.3
Text 60.2
Dress 56.4

Clarifai
created on 2023-10-25

people 99.9
two 99.2
monochrome 98
adult 97.9
woman 97.2
group 96.7
man 96.1
newspaper 94.9
three 93.3
wear 92.7
elderly 91.5
leader 90.7
administration 87.5
group together 87
furniture 86
portrait 84.4
home 82.5
street 82
actress 81.3
room 79.1

Imagga
created on 2022-01-09

people 25.1
man 21.5
male 21.4
person 19.6
adult 18.8
fashion 17.3
portrait 16.2
city 15.8
men 15.4
building 14.9
old 14.6
wall 14.5
happy 14.4
human 14.2
women 14.2
window 14.2
urban 14
business 13.4
interior 13.3
standing 13
shop 12.8
door 12.8
attractive 12.6
life 12.4
couple 12.2
pretty 11.9
dress 11.7
black 11.6
lady 11.4
architecture 11.2
corporate 11.2
home 11.2
two 11
lifestyle 10.8
outdoors 10.4
ancient 10.4
street 10.1
model 10.1
vintage 9.9
family 9.8
clothing 9.6
happiness 9.4
suit 9.3
elegance 9.2
professional 9
room 8.8
entrance 8.7
cute 8.6
travel 8.4
blond 8.4
world 8.4
house 8.4
historic 8.2
retro 8.2
mother 8.1
group 8.1
looking 8
groom 7.9
love 7.9
hand 7.6
one 7.5
alone 7.3
girls 7.3
work 7.2
hair 7.1
smile 7.1
posing 7.1
face 7.1
indoors 7
modern 7

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 99.7
person 98.5
man 91.1
clothing 89.7
dress 89.6
black and white 88.4
woman 78.7
street 72.4

Color Analysis

Face analysis

AWS Rekognition

Age 38-46
Gender Male, 98.7%
Sad 70.4%
Calm 13.7%
Surprised 5%
Confused 3.6%
Fear 3%
Happy 2.2%
Disgusted 1.6%
Angry 0.6%

AWS Rekognition

Age 45-53
Gender Male, 99.9%
Happy 91.4%
Sad 6.4%
Calm 0.6%
Disgusted 0.4%
Angry 0.4%
Confused 0.3%
Surprised 0.3%
Fear 0.2%

AWS Rekognition

Age 50-58
Gender Male, 100%
Calm 33.9%
Confused 24.8%
Surprised 14.4%
Sad 9.5%
Happy 7.4%
Disgusted 4%
Angry 3.2%
Fear 2.9%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 97.9%
Shoe 81.4%

Text analysis

Amazon

4
20294.

Google

20294.
111
20294. 111 0294.
0294.