Human Generated Data

Title

Untitled (audience in Heinz kitchen display room)

Date

1941

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8377

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Interior Design 95.3
Indoors 95.3
Human 94.4
Person 94.3
Person 93.9
Audience 93.1
Crowd 93.1
Room 91.7
Person 86.8
Furniture 86.2
Person 83.6
Person 82.6
Person 81.4
Classroom 79.6
School 79.6
Chair 77
Monitor 76.7
Electronics 76.7
Display 76.7
Screen 76.7
Cafeteria 75.8
Restaurant 75.8
Person 73.2
Person 73
LCD Screen 71
Couch 67.6
People 66.8
Person 63.7
Person 62.6
Sitting 60.4
Person 60.1
Urban 59.1
Living Room 56.5
Person 55.6
Text 55.2
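
The Amazon tags above follow the label/confidence format returned by AWS Rekognition's DetectLabels operation. Below is a minimal sketch of such a call via boto3; it is not the pipeline actually used for this record, and the bucket name and object key are placeholders.

```python
# Minimal sketch (not the actual pipeline): produce label tags like the
# Amazon list above with AWS Rekognition DetectLabels via boto3.
# Bucket and key below are placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "steinmetz-8377.jpg"}},
    MaxLabels=50,
    MinConfidence=55,
)

for label in response["Labels"]:
    # Prints e.g. "Interior Design 95.3" -- the tag/confidence format used above.
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```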

Clarifai
created on 2023-10-25

people 99.9
many 99.5
group 99
education 98.3
adult 97.5
audience 97.1
school 95.5
man 95.1
woman 94.8
classroom 94.8
music 91.8
child 90.7
presentation 90.4
crowd 90
leader 89.1
administration 89
group together 88.1
stump 86.4
room 85.6
chair 85.5
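
Concept tags like the Clarifai list above come from a predict call against a general image-recognition model. The sketch below is an assumption based on Clarifai's public v2 REST API; the endpoint path, model id, and key are placeholders, not the configuration used for this record.

```python
# Hedged sketch only: Clarifai v2 "predict" call for concept tags.
# Endpoint, model id, and key are assumptions/placeholders.
import requests

CLARIFAI_KEY = "YOUR_API_KEY"           # placeholder
MODEL_ID = "general-image-recognition"  # assumed public model id

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {CLARIFAI_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.org/steinmetz-8377.jpg"}}}]},
)
resp.raise_for_status()

for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    # e.g. "people 99.9" -- concept values are 0-1, so scale to match the list.
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```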

Imagga
created on 2022-01-09

art 25.5
classroom 24.5
texture 24.3
pattern 23.9
room 21.9
retro 20.5
rough 20
cemetery 20
surface 19.4
grunge 18.7
design 18.5
city 18.3
wallpaper 17.6
diagram 17.2
modern 16.8
squares 16.6
graphic 16
old 16
contemporary 16
colors 15.9
town 15.7
urban 15.7
visual 15.4
unique 15.2
buildings 15.1
grain 14.7
figures 14.5
floral 14.5
circles 14.4
lines 14.4
plants 13.9
flower 13.8
progressive 13.8
plant 13.4
seamless 13.3
conceptual 13.2
vibrant 13.1
building 12.1
colorful 11.5
flowers 11.3
people 11.1
architecture 10.9
aerial 10.7
fresh 10.5
blackboard 10.2
hall 9.6
shop 9.6
ancient 9.5
cityscape 9.4
center 9.2
business 9.1
dirty 9
black 9
structure 8.9
antique 8.8
skyline 8.5
grungy 8.5
3d 8.5
vintage 8.4
person 8.2
aged 8.1
textured 7.9
wall 7.8
scene 7.8
travel 7.7
drawing 7.7
sky 7.6
dark 7.5
shapes 7.5
landscape 7.4
backdrop 7.4
decoration 7.3
new 7.3
color 7.2
man 7.1
game 7.1
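
Tags in the Imagga-style list above can be generated with Imagga's REST tagging endpoint. The sketch below is based on Imagga's public v2 API rather than the pipeline used here; the credentials and image URL are placeholders.

```python
# Hedged sketch only: Imagga v2 tagging endpoint with HTTP basic auth.
# Key, secret, and image URL are placeholders.
import requests

IMAGGA_KEY = "YOUR_API_KEY"        # placeholder
IMAGGA_SECRET = "YOUR_API_SECRET"  # placeholder

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/steinmetz-8377.jpg"},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
)
resp.raise_for_status()

for item in resp.json()["result"]["tags"]:
    # e.g. "classroom 24.5" -- tag text plus confidence, as listed above.
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')
```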

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 97.4
person 91.9
building 67.7
table 63.4
white 60.1
old 47.3
shop 16.4
crowd 1.1
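
The Microsoft tag list above is the kind of output the Azure Computer Vision analyze operation returns for the Tags feature. The sketch below is an assumption based on the public v3.2 REST API; the endpoint URL and subscription key are placeholders, not the service configuration behind this record.

```python
# Hedged sketch only: Azure Computer Vision v3.2 "analyze" with Tags.
# Endpoint and key are placeholders.
import requests

ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_SUBSCRIPTION_KEY"                                   # placeholder

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY, "Content-Type": "application/json"},
    json={"url": "https://example.org/steinmetz-8377.jpg"},
)
resp.raise_for_status()

for tag in resp.json()["tags"]:
    # e.g. "text 97.4" -- confidences here are 0-1, so scale to match the list.
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```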

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 24-34
Gender Female, 79.6%
Calm 74.2%
Sad 18.5%
Happy 3%
Angry 1.1%
Disgusted 1%
Confused 0.9%
Fear 0.7%
Surprised 0.5%

AWS Rekognition

Age 39-47
Gender Female, 78.9%
Sad 51.1%
Calm 40.2%
Disgusted 2.9%
Confused 1.8%
Happy 1.6%
Angry 1.2%
Fear 0.8%
Surprised 0.5%

AWS Rekognition

Age 35-43
Gender Female, 67.9%
Sad 65%
Calm 21.1%
Fear 5.8%
Happy 4.6%
Confused 1.9%
Disgusted 0.6%
Angry 0.6%
Surprised 0.4%

AWS Rekognition

Age 24-34
Gender Female, 92.7%
Calm 68.2%
Surprised 12.1%
Sad 4.7%
Disgusted 4.4%
Fear 4.2%
Confused 3.8%
Happy 1.4%
Angry 1.1%
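
Each face-analysis block above (age range, gender, and ranked emotion scores) matches what AWS Rekognition's DetectFaces operation returns when all attributes are requested. The sketch below shows such a call in boto3; it is illustrative only, and the bucket and key are placeholders.

```python
# Minimal sketch (not the actual pipeline): AWS Rekognition DetectFaces with
# Attributes=["ALL"] returns the age range, gender, and emotion scores shown
# in the blocks above. Bucket and key are placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "steinmetz-8377.jpg"}},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]   # e.g. {"Low": 24, "High": 34}
    gender = face["Gender"]  # e.g. {"Value": "Female", "Confidence": 79.6}
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        # e.g. "Calm 74.2%", "Sad 18.5%", ...
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```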

Feature analysis

Amazon

Person 94.3%

Text analysis

Amazon

57
17226
17226.
-HAMTPA
XA00X YE3A -HAMTPA
XA00X
YE3A
MALC.
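
Raw strings like the Amazon lines above (including reversed or partial characters from signage in the photograph) are the kind of output AWS Rekognition's DetectText operation returns. A minimal, illustrative sketch follows; bucket and key are placeholders.

```python
# Minimal sketch (not the actual pipeline): AWS Rekognition DetectText.
# Bucket and key are placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_text(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "steinmetz-8377.jpg"}}
)

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        # e.g. "17226", "XA00X YE3A -HAMTPA"
        print(detection["DetectedText"])
```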

Google

17226 YT37A2- AATA
17226
YT37A2-
AATA
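
The Google lines above resemble output from the Cloud Vision API's OCR (text detection) feature. The sketch below shows one way such results can be obtained with the google-cloud-vision client; the storage URI is a placeholder and this is not necessarily how this record was produced.

```python
# Hedged sketch only: Google Cloud Vision text detection (OCR).
# The image URI is a placeholder.
from google.cloud import vision

client = vision.ImageAnnotatorClient()
image = vision.Image(
    source=vision.ImageSource(image_uri="gs://example-bucket/steinmetz-8377.jpg")
)

response = client.text_detection(image=image)

# The first annotation is the full detected block; the rest are individual
# tokens, e.g. "17226", "YT37A2-", "AATA" as listed above.
for annotation in response.text_annotations:
    print(annotation.description)
```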