Human Generated Data

Title

Untitled (groups of people seated at outdoor tables)

Date

1941

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5335

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Person 99.6
Human 99.6
Chair 99.6
Furniture 99.6
Person 99.6
Person 99.4
Person 99.2
Person 98.9
Person 98.8
Person 97.4
Person 91.8
People 89
Person 87.9
Chair 87.3
Crowd 86.6
Face 86.2
Person 80.3
Clothing 72.1
Apparel 72.1
Female 67.3
Tie 66.8
Accessories 66.8
Accessory 66.8
Tie 66.1
Photography 63.6
Photo 63.6
Outdoors 62.9
Person 62.4
Girl 60.1
Indoors 58.8
Room 58.4
Audience 58.3
Sitting 56.3
Text 56.3

Clarifai
created on 2023-10-26

people 99.9
group 99.2
many 98.8
adult 98.5
group together 98.3
man 98.1
woman 97.6
furniture 91.5
leader 90.6
chair 90.6
crowd 89.9
sit 89.2
administration 87.3
meeting 86.1
recreation 85.4
war 83.2
monochrome 81.8
child 81.4
military 81.3
sitting 78.5

Imagga
created on 2022-01-22

brass 86.5
wind instrument 66.7
cornet 48.5
musical instrument 45.8
businessman 34.4
male 32.6
business 31.6
man 29.5
people 26.2
group 25
person 24.4
work 22.7
office 22.6
meeting 18.8
job 18.6
room 17.4
team 17
men 16.3
education 15.6
adult 15.2
plan 15.1
manager 14.9
chart 14.3
desk 14.3
teamwork 13.9
blackboard 13.7
success 13.7
drawing 13.4
student 12.9
looking 12.8
teacher 12.7
teaching 12.6
professional 12.4
design 12.4
classroom 12.1
technology 11.9
finance 11.8
businesswoman 11.8
worker 11.6
hand 11.4
happy 11.3
modern 11.2
sign 10.5
new 10.5
table 10.5
businesspeople 10.4
corporate 10.3
construction 10.3
black 10.2
smiling 10.1
board 9.9
silhouette 9.9
engineer 9.8
designer 9.7
sky 9.6
pencil 9.5
school 9.2
successful 9.1
indoor 9.1
suit 9
human 9
client 8.8
organizer 8.8
conference 8.8
pensive 8.8
architect 8.7
class 8.7
stage 8.6
cloud 8.6
engineering 8.6
grunge 8.5
writing 8.5
bar 8.3
laptop 8.2
symbol 8.1
idea 8
interior 8
women 7.9
designing 7.9
smile 7.8
circuit 7.8
boy 7.8
architecture 7.8
builder 7.7
employee 7.7
project 7.7
diagram 7.7
serious 7.6
customer 7.6
trombone 7.5
building 7.5
presentation 7.4
company 7.4
handsome 7.1
working 7.1

Google
created on 2022-01-22

Chair 88.4
Font 80.3
Hat 79.4
Adaptation 79.2
Suit 78.9
Event 73.3
Room 70.4
Vintage clothing 69.5
Monochrome 66.9
History 65.3
Stock photography 63.9
Sitting 63.9
Monochrome photography 62.8
Recreation 62.6
Team 61.8
Art 60.4
Photo caption 58.3
Sun hat 55.9
Conversation 54.2
Crew 53.7

Microsoft
created on 2022-01-22

person 97.8
text 97.4
clothing 95.1
outdoor 87.7
man 82.1
woman 77.2
drawing 61.3
old 59.1
people 56.7

Face analysis

AWS Rekognition

Age 39-47
Gender Female, 77.3%
Happy 93.7%
Sad 3.7%
Confused 1.1%
Surprised 0.4%
Disgusted 0.4%
Fear 0.3%
Calm 0.2%
Angry 0.2%

AWS Rekognition

Age 42-50
Gender Female, 99%
Surprised 72.2%
Calm 21.3%
Happy 4.1%
Angry 0.7%
Confused 0.5%
Disgusted 0.5%
Sad 0.4%
Fear 0.3%

AWS Rekognition

Age 45-51
Gender Male, 99.9%
Sad 99.6%
Confused 0.1%
Happy 0.1%
Angry 0.1%
Disgusted 0.1%
Calm 0.1%
Surprised 0%
Fear 0%

AWS Rekognition

Age 23-33
Gender Male, 93%
Sad 96.4%
Calm 1.3%
Confused 1.2%
Happy 0.3%
Disgusted 0.3%
Angry 0.2%
Fear 0.2%
Surprised 0.1%

AWS Rekognition

Age 38-46
Gender Male, 92.2%
Calm 99.3%
Sad 0.3%
Angry 0.1%
Disgusted 0.1%
Fear 0.1%
Happy 0%
Confused 0%
Surprised 0%

AWS Rekognition

Age 38-46
Gender Female, 97.3%
Calm 57.6%
Confused 10.4%
Surprised 9.1%
Sad 7.3%
Fear 5%
Disgusted 4.3%
Happy 4%
Angry 2.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%
Chair 99.6%
Tie 66.8%

Categories

Imagga

paintings art 99.6%

Text analysis

Amazon

19542.
17542.

Google

17542. 19542.
17542.
19542.