Human Generated Data

Title

Untitled (men and women in room with stripped walls)

Date

1940

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8454

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Person 99.4
Human 99.4
Person 98.9
Clothing 97.6
Apparel 97.6
Person 97.4
Person 96.4
Furniture 91.9
Sitting 88.7
Couch 82.1
Face 81.7
Chair 78.3
Flooring 76.8
Overcoat 75.6
Coat 75.6
Suit 74.5
Floor 61.6
Door 57.9

Clarifai
created on 2023-10-26

people 99.8
group 98.3
adult 97.6
monochrome 97.6
woman 97.5
group together 95.1
actress 94.5
furniture 93.6
room 93.4
man 93.1
wear 92.8
two 91.5
dressing room 89.3
actor 88.8
administration 87.7
chair 87.3
three 86.1
street 82.5
commerce 81.4
several 80.8

Imagga
created on 2022-01-15

barbershop 49.2
shop 44.3
chair 37.8
mercantile establishment 32.1
barber chair 30.4
seat 25.9
man 22.8
indoors 22.8
place of business 21.7
people 20.1
interior 19.4
furniture 18.6
room 17.4
salon 15.8
building 15.7
male 15.6
men 15.5
office 14.3
inside 13.8
architecture 13.3
home 12.8
business 12.7
old 11.8
lifestyle 11.6
hairdresser 11.3
establishment 11.1
women 11.1
equipment 10.7
table 10.4
sitting 10.3
window 10.2
door 10
adult 9.8
modern 9.8
work 9.4
vintage 9.1
person 9
businessman 8.8
house 8.4
indoor 8.2
computer 8.1
history 8
family 8
love 7.9
design 7.9
wood 7.5
mature 7.4
historic 7.3
suit 7.2
portrait 7.1

Google
created on 2022-01-15

Black 89.5
Chair 87.4
Black-and-white 85.1
Style 83.9
Monochrome photography 76.7
Monochrome 75.5
Font 75.2
Curtain 73.5
Beauty salon 72.6
Art 69.9
Event 68.4
Table 67.9
Room 67.7
Fashion design 67.3
Sitting 66.7
Vintage clothing 66.3
Stock photography 64.1
Barber 58.8
Machine 56.1
Visual arts 55.1

Microsoft
created on 2022-01-15

text 97.9
furniture 95.1
chair 85.7
clothing 85.1
person 84
table 74.7
man 64.2

Face analysis

AWS Rekognition

Age 31-41
Gender Male, 100%
Surprised 38.2%
Calm 28.8%
Happy 13.2%
Disgusted 7.9%
Confused 7.6%
Sad 3.2%
Angry 0.6%
Fear 0.5%

AWS Rekognition

Age 48-54
Gender Female, 92.5%
Sad 48.6%
Calm 45.7%
Angry 2.1%
Surprised 1.4%
Confused 1.4%
Disgusted 0.5%
Fear 0.2%
Happy 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%

Text analysis

Amazon

14494
14994.
1449

Google

144 •NAGON-YT3RA2-MAMT2A3 14994. 9994.
144
•NAGON-YT3RA2-MAMT2A3
14994.
9994.