Human Generated Data

Title

Untitled (four women in living room with Old Gold cigarettes for advertisement, Haverford, PA)

Date

c. 1940

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.11950

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Person 99.2
Human 99.2
Furniture 96.9
Couch 96.9
Person 96.9
Indoors 93.6
Living Room 93.6
Room 93.6
Clothing 91.6
Apparel 91.6
Person 91.2
Flooring 84.3
Sitting 80.5
Chair 78.4
Monitor 78.2
Electronics 78.2
Display 78.2
Screen 78.2
Floor 77.2
LCD Screen 70.4
Shorts 66.1
Wood 61.2
Bed 61
Person 53
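
The Amazon tags above are label-detection scores of the kind returned by Amazon Rekognition. A minimal sketch of such a request, assuming boto3 with AWS credentials configured; the local filename is a hypothetical stand-in for this photograph:

```python
# Minimal sketch: label detection with Amazon Rekognition via boto3.
# "steinmetz_11950.jpg" is a hypothetical local filename for this image.
import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_11950.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=50,  # the tags above are reported down to roughly 53
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```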

Clarifai
created on 2023-10-25

people 99.9
adult 98.6
group 97.5
furniture 97.4
woman 96.2
man 96
group together 95.8
seat 95.1
administration 94.8
sit 93.4
two 93.1
chair 92.8
room 92.3
wear 91.9
three 91.4
actress 90.6
several 90.5
actor 90.3
four 88.8
musician 88.5
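
The Clarifai concepts above come from a general image-recognition model. Below is a hedged sketch of a prediction request against Clarifai's v2 REST API; the API key, model ID, and image URL are placeholders, and the exact request shape may differ by account and API version:

```python
# Hedged sketch: concept tagging via Clarifai's v2 REST API.
# Key, model ID, and image URL are placeholders/assumptions; consult
# Clarifai's current documentation for the exact request format.
import requests

CLARIFAI_API_KEY = "YOUR_API_KEY"                       # placeholder
MODEL_ID = "general-image-recognition"                  # assumed general model
IMAGE_URL = "https://example.org/steinmetz_11950.jpg"   # hypothetical

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {CLARIFAI_API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
    timeout=30,
)
response.raise_for_status()

for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```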

Imagga
created on 2022-01-15

barbershop 100
shop 100
mercantile establishment 87
place of business 58
chair 32.3
establishment 29
man 28.2
room 28
people 25.1
indoors 22.8
sitting 20.6
interior 20.3
lifestyle 19.5
computer 19.2
table 19.1
person 18.5
indoor 18.3
women 17.4
male 17
classroom 16.9
home 16
office 15.9
adult 15.5
business 15.2
men 14.6
furniture 14.3
desk 13.2
working 12.4
inside 12
barber chair 11.8
couple 11.3
seat 11.2
mature 11.2
portrait 11
occupation 11
laptop 10.9
communication 10.9
smile 10.7
happy 10.6
businessman 10.6
modern 10.5
group 10.5
two 10.2
smiling 10.1
restaurant 9.8
work 9.4
casual 9.3
window 9.2
teacher 9.1
technology 8.9
together 8.8
class 8.7
talking 8.6
meeting 8.5
relax 8.4
relaxation 8.4
house 8.4
salon 8.3
color 8.3
phone 8.3
back 8.3
relaxing 8.2
hairdresser 8.1
job 8
love 7.9
building 7.9
education 7.8
glass 7.8
old 7.7
sofa 7.7
senior 7.5
style 7.4
light 7.4
lady 7.3
aged 7.2
professional 7.2
happiness 7.1
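
The Imagga tags above are the kind of output returned by Imagga's tagging endpoint. A hedged sketch follows, with placeholder credentials and a hypothetical image URL; field names follow Imagga's documented v2 response shape but may vary by API version:

```python
# Hedged sketch: tagging with Imagga's /v2/tags endpoint.
# Key, secret, and image URL are placeholders/assumptions.
import requests

IMAGGA_KEY = "YOUR_API_KEY"        # placeholder
IMAGGA_SECRET = "YOUR_API_SECRET"  # placeholder
IMAGE_URL = "https://example.org/steinmetz_11950.jpg"  # hypothetical

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
    timeout=30,
)
response.raise_for_status()

for item in response.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')
```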

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

text 98.6
indoor 85.4
person 84.8
clothing 78.3
black and white 75.5
furniture 63.4
woman 63.1
table 59.6
desk 6.3
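
The Microsoft tags above resemble output from the Azure Computer Vision "Analyze Image" endpoint. A hedged sketch follows; the endpoint host, key, API version, and image URL are placeholders:

```python
# Hedged sketch: image tagging with Azure Computer Vision's Analyze Image
# REST endpoint. Endpoint, key, and image URL are placeholders/assumptions.
import requests

AZURE_ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # placeholder
AZURE_KEY = "YOUR_SUBSCRIPTION_KEY"                                   # placeholder
IMAGE_URL = "https://example.org/steinmetz_11950.jpg"                 # hypothetical

response = requests.post(
    f"{AZURE_ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": AZURE_KEY},
    json={"url": IMAGE_URL},
    timeout=30,
)
response.raise_for_status()

for tag in response.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```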

Color Analysis

Face analysis

AWS Rekognition

Age 30-40
Gender Female, 57.5%
Calm 96.6%
Sad 1.7%
Fear 0.7%
Happy 0.4%
Angry 0.2%
Disgusted 0.2%
Confused 0.2%
Surprised 0.1%

AWS Rekognition

Age 24-34
Gender Female, 55.4%
Calm 87.1%
Sad 6.6%
Fear 2.7%
Happy 1.1%
Surprised 0.7%
Confused 0.7%
Disgusted 0.7%
Angry 0.4%

AWS Rekognition

Age 25-35
Gender Male, 99.9%
Sad 49.8%
Confused 34.4%
Disgusted 6.8%
Angry 3.6%
Surprised 1.8%
Fear 1.6%
Calm 1.2%
Happy 0.7%
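
The age, gender, and emotion estimates above are per-face results of the kind returned by Amazon Rekognition's face detection. A minimal sketch, assuming boto3 credentials and a hypothetical local filename:

```python
# Minimal sketch: per-face age/gender/emotion estimates with Amazon
# Rekognition DetectFaces. "steinmetz_11950.jpg" is a hypothetical filename.
import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_11950.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # include age range, gender, and emotions
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    top_emotion = max(face["Emotions"], key=lambda e: e["Confidence"])
    print(f'Age {age["Low"]}-{age["High"]}, '
          f'{gender["Value"]} {gender["Confidence"]:.1f}%, '
          f'{top_emotion["Type"].title()} {top_emotion["Confidence"]:.1f}%')
```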

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
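
The Google Vision results above are face-annotation likelihood buckets from the Cloud Vision API. A hedged sketch using the google-cloud-vision Python client (v2+ interface assumed); the filename is a hypothetical stand-in:

```python
# Hedged sketch: face annotation with the Cloud Vision API.
# Assumes application-default credentials; the filename is hypothetical.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("steinmetz_11950.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each attribute is reported as a likelihood bucket, not a numeric score.
    print("Joy:", vision.Likelihood(face.joy_likelihood).name,
          "| Sorrow:", vision.Likelihood(face.sorrow_likelihood).name,
          "| Anger:", vision.Likelihood(face.anger_likelihood).name,
          "| Surprise:", vision.Likelihood(face.surprise_likelihood).name,
          "| Headwear:", vision.Likelihood(face.headwear_likelihood).name,
          "| Blurred:", vision.Likelihood(face.blurred_likelihood).name)
```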

Feature analysis

Amazon

Person 99.2%
Chair 78.4%
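
The feature scores above correspond to detected object instances (labels with bounding boxes) rather than whole-image tags; in Rekognition these arrive in the Instances field of the same detect_labels response used for the tag list. A minimal sketch, assuming boto3 credentials and a hypothetical filename:

```python
# Minimal sketch: object instances from Rekognition detect_labels.
# "steinmetz_11950.jpg" is a hypothetical local filename for this image.
import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_11950.jpg", "rb") as f:
    response = rekognition.detect_labels(Image={"Bytes": f.read()})

for label in response["Labels"]:
    for instance in label.get("Instances", []):
        box = instance["BoundingBox"]
        print(f'{label["Name"]} {instance["Confidence"]:.1f}% '
              f'at left={box["Left"]:.2f}, top={box["Top"]:.2f}')
```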

Text analysis

Amazon

MJI7
MJI7 A70A
A70A
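
The detected strings above are line- and word-level results of the kind returned by Amazon Rekognition's text detection. A minimal sketch, assuming boto3 credentials and a hypothetical local filename:

```python
# Minimal sketch: text detection with Amazon Rekognition DetectText.
# "steinmetz_11950.jpg" is a hypothetical local filename for this image.
import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_11950.jpg", "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    # Rekognition reports both LINE and WORD detections, which is why a
    # combined line can appear alongside its individual words.
    print(detection["Type"], detection["DetectedText"],
          f'{detection["Confidence"]:.1f}%')
```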