Human Generated Data

Title

Untitled (circus employees sitting around table with dog)

Date

c. 1940

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7374

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Furniture 99.4
Person 98.9
Human 98.9
Person 98.2
Dining Table 98
Table 98
Person 98
Person 96.7
Dining Room 92.7
Room 92.7
Indoors 92.7
Chair 90.8
Meal 90.6
Food 90.6
Dish 79.9
Restaurant 79.6
People 75.2
Military Uniform 62.8
Military 62.8
Female 61.2
Couch 60.5
Cafeteria 59
Girl 58.9
Soldier 58.7
Photography 56.3
Photo 56.3
Dinner 56
Supper 56
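
These label/confidence pairs have the shape of output from Amazon Rekognition's DetectLabels operation. A minimal sketch with boto3, assuming configured AWS credentials; the file name and the confidence cutoff are illustrative, not taken from the record:

```python
import boto3

client = boto3.client("rekognition")

# File name is a placeholder for the digitized photograph.
with open("steinmetz_7374.jpg", "rb") as f:
    image_bytes = f.read()

# DetectLabels returns label names with 0-100 confidence scores,
# matching pairs like "Furniture 99.4" and "Person 98.9" above.
response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,  # illustrative cutoff; the lowest score listed is 56
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```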

Clarifai
created on 2023-10-25

people 99.9
child 97.9
group 96.9
adult 96
man 95.8
monochrome 94.3
sit 93.9
chair 92
education 91.8
group together 91
woman 90.8
furniture 90.6
boy 88.6
family 86.3
indoors 84.7
teacher 83.8
table 83.3
room 80.7
son 78.5
nostalgia 78.1
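
Clarifai concepts like these are typically retrieved from its v2 predict endpoint. A rough sketch over REST; the model path, auth header, and response shape are assumptions to be checked against current Clarifai documentation, and the token and file name are placeholders:

```python
import base64
import requests

PAT = "YOUR_CLARIFAI_PAT"  # placeholder personal access token
URL = ("https://api.clarifai.com/v2/users/clarifai/apps/main/"
       "models/general-image-recognition/outputs")

with open("steinmetz_7374.jpg", "rb") as f:
    b64 = base64.b64encode(f.read()).decode()

resp = requests.post(
    URL,
    headers={"Authorization": f"Key {PAT}"},
    json={"inputs": [{"data": {"image": {"base64": b64}}}]},
)
resp.raise_for_status()

# Concepts arrive as name/value pairs with values in 0-1; scaled by
# 100 they read like "people 99.9" above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```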

Imagga
created on 2022-01-08

laptop 29.8
man 29.6
computer 25.8
people 25.6
male 25.5
business 25.5
person 24.1
work 22.1
office 20.4
blackboard 19.7
adult 18.8
technology 17.1
happy 16.9
home 15.9
working 15.9
classroom 15.5
room 15.1
senior 15
brass 14.8
smiling 14.5
looking 14.4
businessman 14.1
education 13.8
sitting 13.7
world 13.6
team 13.4
desk 13.3
old 13.2
businesswoman 12.7
finance 12.7
paper 12.6
worker 12.5
wind instrument 12
casual 11.9
indoors 11.4
smile 11.4
group 11.3
mature 11.1
money 11.1
portrait 11
lifestyle 10.8
job 10.6
professional 10.6
notebook 10.5
one 10.4
student 10.3
men 10.3
dollar 10.2
teamwork 10.2
newspaper 9.9
musical instrument 9.6
together 9.6
couple 9.6
elderly 9.6
screen 9.5
meeting 9.4
keyboard 9.4
iron lung 9.1
indoor 9.1
women 8.7
corporate 8.6
table 8.6
daily 8.4
product 8.3
currency 8.1
device 8.1
success 8
monitor 8
copy 8
cornet 7.9
teaching 7.8
older 7.8
attractive 7.7
retirement 7.7
communication 7.6
vintage 7.4
teacher 7.3
respirator 7.3
aged 7.2
bank 7.2
modern 7
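
Imagga exposes tagging through its v2 API with HTTP Basic auth. A sketch using a hosted image URL; the key, secret, and URL are placeholders, and the response shape is assumed from Imagga's documented format:

```python
import requests

auth = ("IMAGGA_API_KEY", "IMAGGA_API_SECRET")  # placeholders
image_url = "https://example.org/steinmetz_7374.jpg"  # placeholder

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": image_url},
    auth=auth,
)
resp.raise_for_status()

# Tags come back with 0-100 confidences, e.g. "laptop 29.8" above.
# The related /v2/categories endpoint yields groupings like the
# "paintings art 100%" entry under Categories below.
for item in resp.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')
```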

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

text 99.2
table 97.8
sitting 96.3
person 93.3
laptop 93.1
furniture 92
clothing 91.8
chair 88.4
people 68.1
working 58.6
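
These tags match the shape of Azure Computer Vision's Tag operation. A sketch against the v3.2 REST endpoint; the resource host, key, and file name are placeholders:

```python
import requests

endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
key = "YOUR_AZURE_KEY"  # placeholder

with open("steinmetz_7374.jpg", "rb") as f:
    image_bytes = f.read()

resp = requests.post(
    f"{endpoint}/vision/v3.2/tag",
    headers={
        "Ocp-Apim-Subscription-Key": key,
        "Content-Type": "application/octet-stream",
    },
    data=image_bytes,
)
resp.raise_for_status()

# Confidences are 0-1; scaled by 100 they match pairs like "text 99.2".
for tag in resp.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```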

Face analysis

AWS Rekognition

Age 26-36
Gender Female, 78.4%
Sad 87.6%
Calm 10.4%
Confused 0.7%
Angry 0.5%
Disgusted 0.3%
Surprised 0.2%
Happy 0.2%
Fear 0.1%

AWS Rekognition

Age 45-53
Gender Male, 54.5%
Calm 88.7%
Sad 7.6%
Happy 1.5%
Angry 1%
Disgusted 0.4%
Confused 0.4%
Fear 0.2%
Surprised 0.2%
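
The age range, gender, and emotion percentages above are the form of Amazon Rekognition's DetectFaces output when all facial attributes are requested. A minimal boto3 sketch; the file name is illustrative:

```python
import boto3

client = boto3.client("rekognition")

with open("steinmetz_7374.jpg", "rb") as f:
    image_bytes = f.read()

# Attributes=["ALL"] requests age range, gender, emotions, etc.
response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, '
          f'{face["Gender"]["Confidence"]:.1f}%')
    # Emotions are ranked with 0-100 confidences, e.g. "Sad 87.6%".
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')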

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Unlikely
Blurred Very unlikely
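
The "Very unlikely"/"Unlikely" buckets above are Google Vision's Likelihood enum, one block per detected face. A minimal sketch with the google-cloud-vision client, assuming configured credentials; the file name is illustrative:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("steinmetz_7374.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each face annotation carries per-attribute likelihood buckets,
# e.g. VERY_UNLIKELY, matching the entries listed above.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```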

Feature analysis

Amazon

Person 98.9%
Chair 90.8%
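
Per-object percentages like these typically come from the Instances field of the same DetectLabels response sketched earlier: labels that are localized in the frame (e.g. Person, Chair) also carry bounding boxes and per-instance confidences. A short self-contained sketch, file name illustrative:

```python
import boto3

client = boto3.client("rekognition")
with open("steinmetz_7374.jpg", "rb") as f:
    response = client.detect_labels(Image={"Bytes": f.read()})

# Only localized labels have Instances; each instance carries its own
# bounding box and confidence, which a per-object listing reflects.
for label in response["Labels"]:
    for instance in label.get("Instances", []):
        box = instance["BoundingBox"]
        print(f'{label["Name"]} {instance["Confidence"]:.1f}% at '
              f'left={box["Left"]:.2f}, top={box["Top"]:.2f}')
```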

Categories

Imagga

paintings art 100%

Text analysis

Amazon

19064
19064.
for
MАOОN-YTERA
MАOОN-YTERA -ИАМТ2АВ
-ИАМТ2АВ
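
Lines like these are the shape of Amazon Rekognition's DetectText output; the Cyrillic-looking strings suggest the OCR engine reading apparently mirror-reversed printing in the photograph. A minimal boto3 sketch, file name illustrative:

```python
import boto3

client = boto3.client("rekognition")

with open("steinmetz_7374.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_text(Image={"Bytes": image_bytes})

# Detections come back as LINE and WORD entries; printing only lines
# reproduces a list like the one above.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])
```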

Google

190 64. 65 U Arinuirr
190
64.
65
U
Arinuirr
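
Google's equivalent is the text_detection method, which returns one full-text annotation followed by per-word entries, matching the pattern above (full string first, then tokens). A minimal sketch, assuming configured credentials and an illustrative file name:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("steinmetz_7374.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# The first annotation is the concatenated text; the rest are tokens.
for annotation in response.text_annotations:
    print(annotation.description)
```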