Human Generated Data

Title

Untitled (men and women standing near fireplace with drinks and food, Jos. Wharton Estate (Lippincott))

Date

1940

People

Artist: Joseph Janney Steinmetz, American 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5159

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Human 99.6
Person 99.6
Person 99.3
Chair 98.9
Furniture 98.9
Person 98.7
Person 98.5
Clinic 98
Person 97.6
Person 97.2
Person 97.2
Person 90.6
Hospital 86.5
Person 84.7
Operating Theatre 83.8
Person 80.7
Person 78.8
Person 76.6
Indoors 71.4
Person 71.1
Person 70.4
Room 65.7
Doctor 57.9
Surgery 55.2

Clarifai
created on 2023-10-26

people 99.7
group 98.3
man 97.2
adult 96.9
education 95.8
many 95.8
woman 94.2
monochrome 91.1
administration 90.7
school 89.4
leader 89.2
group together 89
teacher 87.8
war 86.2
classroom 84.8
sitting 84.5
child 83.4
chair 83.1
league 81
sit 79.3

Imagga
created on 2022-01-23

brass 31.8
newspaper 30.2
product 26.5
wind instrument 24.7
people 24.5
person 24.3
business 24.3
creation 21.3
daily 19.6
man 18.8
businessman 18.5
male 18.4
work 17.8
nurse 17.6
musical instrument 16.7
adult 15.9
team 14.3
group 13.7
human 13.5
job 13.3
men 12.9
office 12.8
professional 12.6
silhouette 11.6
health 11.1
grunge 11.1
worker 10.8
city 10.8
drawing 10.4
plan 10.4
portrait 10.3
construction 10.3
manager 10.2
businesswoman 10
life 9.9
sign 9.8
working 9.7
medical 9.7
technology 9.6
urban 9.6
women 9.5
sketch 9.2
suit 9
room 8.8
smiling 8.7
lifestyle 8.7
chart 8.6
architecture 8.6
meeting 8.5
doctor 8.5
planner 8.4
modern 8.4
sky 8.3
symbol 8.1
medicine 7.9
design 7.9
day 7.8
couple 7.8
standing 7.8
model 7.8
old 7.7
cornet 7.5
company 7.4
teamwork 7.4
20s 7.3
backgrounds 7.3
graphic 7.3
success 7.2
looking 7.2
clothing 7.1
indoors 7

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 99.3
person 95.1
clothing 82.8
furniture 78.3
table 78.3
chair 74.7
woman 71.1
people 66.8
wedding dress 63.1
group 57

Face analysis

Amazon

Google

AWS Rekognition

Age 37-45
Gender Female, 83.9%
Happy 92.7%
Sad 2%
Calm 1.7%
Disgusted 1%
Angry 0.9%
Confused 0.6%
Surprised 0.6%
Fear 0.6%

AWS Rekognition

Age 51-59
Gender Male, 89.2%
Sad 58.7%
Calm 21%
Happy 9.1%
Confused 5.3%
Angry 2.4%
Surprised 1.7%
Disgusted 1.3%
Fear 0.6%

AWS Rekognition

Age 48-56
Gender Female, 96.2%
Calm 87.9%
Happy 8.7%
Sad 1.8%
Angry 0.4%
Surprised 0.4%
Disgusted 0.4%
Confused 0.2%
Fear 0.2%

AWS Rekognition

Age 27-37
Gender Female, 94.7%
Sad 81%
Calm 9.5%
Confused 4.1%
Happy 3.8%
Fear 0.5%
Disgusted 0.4%
Angry 0.4%
Surprised 0.3%

AWS Rekognition

Age 47-53
Gender Male, 51.1%
Calm 99.9%
Confused 0%
Sad 0%
Happy 0%
Surprised 0%
Angry 0%
Disgusted 0%
Fear 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%
Chair 98.9%

Categories

Imagga

paintings art 99.9%

Text analysis

Amazon

13704
ИАМТ2АЗ

Google

13704. 13704.
13704.