Human Generated Data

Title

Untitled (woman seated at work space)

Date

1937

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8206

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 98.1
Human 98.1
Chair 97.2
Furniture 97.2
Person 96.5
Interior Design 91.2
Indoors 91.2
Chair 89.5
Person 87.9
Person 86
Person 74.7
Face 73.4
Building 72.4
Room 68.8
People 62.2
Art 60.4
Clothing 59.7
Apparel 59.7
Brick 58.9
Architecture 57.6
X-Ray 55.7
Ct Scan 55.7
Medical Imaging X-Ray Film 55.7
Person 43.9
Person 41.8
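
The Amazon tags above follow the label/confidence format returned by AWS Rekognition label detection. As a rough illustration only, here is a minimal Python sketch of how such pairs are typically retrieved, assuming boto3 and an image stored in S3; the bucket and object names are placeholders, not the museum's actual storage:

    # Hypothetical sketch: producing label/confidence pairs like the list above
    # with AWS Rekognition. Bucket and key names are placeholders.
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    response = rekognition.detect_labels(
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "steinmetz-8206.jpg"}},
        MaxLabels=25,
        MinConfidence=40,
    )

    for label in response["Labels"]:
        # Prints pairs such as "Person 98.1", matching the format of the tag list.
        print(f"{label['Name']} {label['Confidence']:.1f}")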

Clarifai
created on 2023-10-25

people 98.7
monochrome 98.1
adult 92.1
room 91.3
no person 90.9
man 90
indoors 89.1
furniture 88.4
music 87.6
chair 86.2
group 85.3
architecture 81.1
street 78.7
art 78.4
illustration 76.8
industry 76.7
building 76.6
technology 76.3
many 75.3
design 75.1

Imagga
created on 2022-01-08

architecture 64.9
building 40
monument 34.6
arch 32.6
city 32.4
column 31.8
landmark 29.8
history 29.5
ancient 28.6
tourism 24.8
stone 24.6
old 24.4
facade 24.4
marble 24.1
statue 24
travel 24
famous 23.3
church 23.1
historic 22.9
structure 21.3
sculpture 20.6
historical 19.8
balcony 18.9
religion 18.8
art 18.4
cathedral 17.7
window 15.9
tourist 15.6
culture 15.4
memorial 14.1
exterior 13.8
street 13.8
house 12.9
antique 12.1
town 12.1
fountain 11.9
sky 10.8
catholic 10.7
door 10.6
god 10.5
urban 10.5
wall 10.5
style 10.4
architectural 9.6
medieval 9.6
religious 9.4
sketch 9.3
roman 8.9
white goods 8.8
capital 8.5
place 8.4
traditional 8.3
palace 8.3
museum 8.1
home appliance 8
home 8
pillar 7.9
dome 7.8
century 7.8
baroque 7.8
drawing 7.7
entrance 7.7
windows 7.7
buildings 7.6
destination 7.5
outdoors 7.5
vintage 7.5
light 7.4
national 7.3
detail 7.2
device 7.1
interior 7.1

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

text 95.5
black and white 72.9

Color Analysis

Face analysis

AWS Rekognition

Age 34-42
Gender Male, 99.5%
Calm 98.6%
Surprised 0.6%
Disgusted 0.2%
Confused 0.2%
Happy 0.1%
Sad 0.1%
Angry 0.1%
Fear 0.1%

AWS Rekognition

Age 40-48
Gender Male, 99.2%
Happy 81.6%
Sad 5.9%
Calm 3.6%
Surprised 3.2%
Confused 2.1%
Disgusted 1.4%
Fear 1.2%
Angry 1.1%

AWS Rekognition

Age 21-29
Gender Male, 85.5%
Calm 53.3%
Sad 42.8%
Angry 1.3%
Confused 0.8%
Disgusted 0.7%
Fear 0.4%
Happy 0.3%
Surprised 0.3%

AWS Rekognition

Age 37-45
Gender Male, 84.9%
Calm 74.5%
Sad 19%
Happy 1.8%
Confused 1.6%
Fear 1%
Disgusted 0.9%
Surprised 0.7%
Angry 0.5%
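
The AWS Rekognition face results above (age range, gender, and ranked emotion scores) correspond to Rekognition's face detection output. A minimal sketch, again assuming boto3 and placeholder S3 names:

    # Hypothetical sketch: AWS Rekognition face analysis of the kind shown above.
    # Bucket and key names are placeholders.
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    response = rekognition.detect_faces(
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "steinmetz-8206.jpg"}},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotions sorted from most to least confident, as in the listings above.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")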

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
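
Google Vision reports face attributes as likelihood ratings (Very unlikely through Very likely) rather than percentages. A minimal sketch using the google-cloud-vision Python client, assuming a local copy of the image; the file name is a placeholder:

    # Hypothetical sketch: Google Cloud Vision face detection, which returns
    # likelihood enums rather than confidence percentages. File name is a placeholder.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("steinmetz-8206.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    # Maps the Likelihood enum (0-5) to the labels used in the listings above.
    likelihood_names = ("Unknown", "Very unlikely", "Unlikely",
                        "Possible", "Likely", "Very likely")

    for face in response.face_annotations:
        print("Surprise", likelihood_names[face.surprise_likelihood])
        print("Anger", likelihood_names[face.anger_likelihood])
        print("Sorrow", likelihood_names[face.sorrow_likelihood])
        print("Joy", likelihood_names[face.joy_likelihood])
        print("Headwear", likelihood_names[face.headwear_likelihood])
        print("Blurred", likelihood_names[face.blurred_likelihood])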

Feature analysis

Amazon

Person 98.1%
Chair 97.2%

Categories

Captions

Text analysis

Amazon

5932
zebs
SVEETS
БИДБИ SVEETS
БИДБИ
BUT
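
The detected strings above are the kind of output produced by AWS Rekognition text detection. A minimal sketch, assuming boto3 and the same placeholder S3 names as before:

    # Hypothetical sketch: AWS Rekognition text detection, which yields strings
    # like those listed above. Bucket and key names are placeholders.
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    response = rekognition.detect_text(
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "steinmetz-8206.jpg"}}
    )

    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":  # skip the individual WORD detections
            print(detection["DetectedText"])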