Human Generated Data

Title

Untitled (portrait of woman with two girls on living room couch, Philadelphia)

Date

c. 1940

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12077

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Person 99.4
Human 99.4
Person 99.4
Person 97.7
Clothing 97.5
Apparel 97.5
Furniture 97
Helmet 96.1
Sitting 87.3
Couch 87
Home Decor 81.8
Shoe 79.7
Footwear 79.7
Chair 70.8
Sunglasses 69.3
Accessories 69.3
Accessory 69.3
Shorts 66.9
Overcoat 62.5
Coat 62.5
Person 62.4
Suit 62.2
Face 61.7
Female 57.9
Room 56.8
Indoors 56.8
LCD Screen 56.2
Electronics 56.2
Screen 56.2
Monitor 56.2
Display 56.2
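
The Amazon tags above have the shape of AWS Rekognition label output (label name plus a confidence percentage). A minimal sketch of how such a list might be produced with boto3's detect_labels, assuming local AWS credentials; the file name and region are placeholders:

```python
# Minimal sketch: label tags of this shape come from AWS Rekognition DetectLabels.
# "photo.jpg" and the region are placeholders, not part of the original record.
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=50,
        MinConfidence=55,  # the lowest label in the list above is about 56%
    )

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```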

Clarifai
created on 2023-10-26

people 99.8
woman 97.9
adult 96.3
furniture 96
seat 95.9
two 95.7
monochrome 95.3
street 94.6
man 94.5
chair 93.6
sit 93.6
room 93
child 91.5
portrait 91.2
art 89.6
actress 87.3
group 86.6
wear 85.2
one 84.8
wedding 82.9
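
The Clarifai concepts above (name plus a 0-100 score) look like output from Clarifai's general prediction model. A rough sketch against the v2 REST predict endpoint; the model alias, endpoint path, and auth header are assumptions to verify against current Clarifai documentation:

```python
# Rough sketch: concepts of this shape come from a Clarifai predict call.
# The model alias, endpoint path, and "Key" auth header are assumptions;
# verify against current Clarifai API documentation before use.
import base64
import requests

API_KEY = "YOUR_CLARIFAI_KEY"           # placeholder
MODEL_ID = "general-image-recognition"  # assumed alias of Clarifai's general model

with open("photo.jpg", "rb") as f:      # placeholder file name
    image_b64 = base64.b64encode(f.read()).decode()

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"base64": image_b64}}}]},
)
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```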

Imagga
created on 2022-01-15

people 33.5
man 28.9
person 26.8
room 23.7
adult 23.1
couple 21.8
male 21.6
women 18.2
portrait 18.1
men 18
home 16.7
indoors 16.7
happy 16.3
love 15.8
black 15.7
sitting 15.5
attractive 15.4
business 15.2
together 14.9
sexy 14.5
dress 14.4
leisure 14.1
kin 14
fashion 13.6
lifestyle 13
indoor 12.8
happiness 11.7
smiling 11.6
barbershop 11.5
office 11.5
businessman 11.5
computer 11.3
chair 10.9
salon 10.8
smile 10.7
shop 10.6
lady 10.5
world 10.5
one 10.4
looking 10.4
laptop 10.3
casual 10.2
mother 10.1
elegance 10.1
pretty 9.8
old 9.7
cheerful 9.7
style 9.6
window 9.6
passion 9.4
friends 9.4
two 9.3
relaxation 9.2
child 9.2
silhouette 9.1
suit 9
professional 8.9
family 8.9
drinking 8.6
elegant 8.6
model 8.6
youth 8.5
mercantile establishment 8.4
alone 8.2
retro 8.2
posing 8
body 8
interior 8
working 8
hair 7.9
parent 7.9
day 7.8
standing 7.8
face 7.8
full length 7.8
corporate 7.7
modern 7.7
husband 7.6
daughter 7.6
human 7.5
relationship 7.5
fun 7.5
vintage 7.4
classroom 7.4
light 7.3
20s 7.3
businesswoman 7.3
aged 7.2
music 7.2
holiday 7.2
blackboard 7.1
night 7.1
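
The Imagga tags follow the same name-plus-confidence pattern. A rough sketch using Imagga's /v2/tags REST endpoint with HTTP basic auth; the key, secret, file name, and exact response field names are assumptions based on Imagga's published API shape:

```python
# Rough sketch: Imagga's /v2/tags endpoint with HTTP basic auth.
# Key, secret, and file name are placeholders; the response fields
# (result.tags[].tag.en / confidence) follow Imagga's documented shape.
import requests

API_KEY, API_SECRET = "YOUR_KEY", "YOUR_SECRET"  # placeholders

with open("photo.jpg", "rb") as f:  # placeholder file name
    resp = requests.post(
        "https://api.imagga.com/v2/tags",
        auth=(API_KEY, API_SECRET),
        files={"image": f},
    )

for entry in resp.json()["result"]["tags"]:
    print(f"{entry['tag']['en']} {entry['confidence']:.1f}")
```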

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

text 99
black and white 96.3
clothing 95
person 94.1
furniture 87.9
monochrome 80.6
man 79.3
street 78.7
footwear 74.2
chair 66
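
The Microsoft tags resemble Azure Computer Vision "tag" results. A rough sketch with the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and file name are placeholders:

```python
# Rough sketch: Azure Computer Vision image tagging via the Python SDK
# (azure-cognitiveservices-vision-computervision). Endpoint, key, and file
# name are placeholders; confidence is assumed to be on a 0-1 scale.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"  # placeholder
client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials("YOUR_KEY"))

with open("photo.jpg", "rb") as f:  # placeholder file name
    result = client.tag_image_in_stream(f)

for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```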

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 23-31
Gender Male, 94.6%
Calm 98.3%
Sad 1.5%
Confused 0.1%
Angry 0.1%
Disgusted 0%
Happy 0%
Surprised 0%
Fear 0%

AWS Rekognition

Age 25-35
Gender Male, 97%
Sad 51.8%
Calm 21.9%
Fear 12.1%
Angry 5.6%
Disgusted 3%
Happy 2.6%
Surprised 2.1%
Confused 1%

AWS Rekognition

Age 18-26
Gender Female, 62.1%
Sad 74.7%
Confused 19.6%
Surprised 2.1%
Calm 1.3%
Disgusted 0.9%
Fear 0.6%
Angry 0.4%
Happy 0.3%

AWS Rekognition

Age 35-43
Gender Male, 90.2%
Calm 69.7%
Sad 10.5%
Happy 6.2%
Disgusted 3.9%
Surprised 3.8%
Angry 3.2%
Confused 1.5%
Fear 1.3%

AWS Rekognition

Age 33-41
Gender Male, 90.4%
Calm 99.9%
Happy 0%
Surprised 0%
Disgusted 0%
Sad 0%
Confused 0%
Fear 0%
Angry 0%
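
Each face block above (an age range, a gender estimate, and an emotion distribution summing to roughly 100%) matches the shape of AWS Rekognition face detail output. A minimal sketch, assuming boto3 credentials; the file name and region are placeholders:

```python
# Minimal sketch: per-face age, gender, and emotion estimates of this shape
# come from AWS Rekognition DetectFaces with all attributes requested.
# "photo.jpg" and the region are placeholders.
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```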

Feature analysis

Amazon

Person 99.4%
Helmet 96.1%
Couch 87%
Shoe 79.7%
Sunglasses 69.3%
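
The feature analysis entries are the labels for which Rekognition also reports individual object instances. A minimal sketch of reading those instance-level detections and their bounding boxes; the file name and region are again placeholders:

```python
# Minimal sketch: object-level entries (Person, Couch, Shoe, ...) are the
# DetectLabels results that also carry per-instance bounding boxes.
# "photo.jpg" and the region are placeholders.
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    labels = client.detect_labels(Image={"Bytes": f.read()})["Labels"]

for label in labels:
    for instance in label.get("Instances", []):
        box = instance["BoundingBox"]  # normalized Left/Top/Width/Height
        print(f"{label['Name']} {instance['Confidence']:.1f}% box={box}")
```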

Text analysis

Amazon

MJI3
MJI3 A70A
6730
A70A
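
Strings like these (most likely film edge markings read as text) match AWS Rekognition text detection output, which returns both whole lines and individual words. A minimal sketch; the file name and region are placeholders:

```python
# Minimal sketch: OCR strings of this kind (here most likely film edge
# markings) come from AWS Rekognition DetectText. Placeholders as above.
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    response = client.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"],
          f"{detection['Confidence']:.1f}%")
```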

Google

MJI7 YT33A 2 AJJA
MJI7
YT33A
2
AJJA
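
The Google strings have the shape of Cloud Vision OCR output, where the first annotation is the full detected string and the rest are individual tokens. A minimal sketch with the google-cloud-vision client, assuming application default credentials; the file name is a placeholder:

```python
# Minimal sketch: Google Cloud Vision text detection, where the first
# annotation is the full detected string and the rest are individual tokens.
# Assumes GOOGLE_APPLICATION_CREDENTIALS is configured; file name is a placeholder.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)
for annotation in response.text_annotations:
    print(annotation.description)
```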