Human Generated Data

Title

Untitled (man and woman in crowded store with Christmas (?) display)

Date

c. 1950

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4480

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Human 99.6
Person 99.6
Person 99.6
Person 99.5
Person 98.6
Person 98.3
Apparel 97
Clothing 97
Person 95.5
Person 92
Person 91.8
Female 85
Dress 79.5
Person 78.1
Face 75.9
Person 72
People 71.5
Crowd 68.7
Woman 68.5
Helmet 62.5
Person 61.7
Photography 61
Photo 61
Portrait 60.9
Party 60
Beverage 57.5
Drink 57.5
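
The Rekognition label list above repeats generic labels such as Person once per detected instance. As a minimal sketch (a hypothetical helper, not part of the museum's pipeline), such a list can be collapsed to the highest confidence seen per distinct label, with an optional confidence cutoff:

```python
def top_confidence(pairs, min_conf=0.0):
    """Keep the highest confidence observed for each distinct label."""
    best = {}
    for name, conf in pairs:
        if conf >= min_conf and conf > best.get(name, -1.0):
            best[name] = conf
    return best

# (label, confidence) pairs transcribed from the Rekognition output above (subset)
labels = [
    ("Human", 99.6), ("Person", 99.6), ("Person", 99.6), ("Person", 99.5),
    ("Apparel", 97.0), ("Clothing", 97.0), ("Person", 95.5), ("Female", 85.0),
    ("Dress", 79.5), ("Crowd", 68.7), ("Helmet", 62.5), ("Party", 60.0),
]

print(top_confidence(labels, min_conf=70.0))
```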

Clarifai
created on 2023-10-26

people 99.8
many 98.8
man 98.1
group 97.9
adult 96.4
woman 94.3
group together 92.1
crowd 88.6
wear 87.5
administration 83.1
leader 83
several 82.1
child 80.7
wedding 78.7
military 78
music 75.8
ceremony 74.1
musician 73.7
monochrome 73.3
celebration 73

Imagga
created on 2022-01-23

brass 36
wind instrument 30.6
people 28.4
businessman 27.3
male 26.9
person 26.6
man 25.5
business 24.3
cornet 23.9
musical instrument 21.6
adult 17.7
work 17.3
professional 16.6
job 15.9
men 15.4
looking 15.2
group 14.5
office 14.4
human 12.7
suit 12.7
team 12.5
medical 12.3
worker 12
technology 11.9
old 11.1
day 11
chart 10.5
building 10.4
doctor 10.3
bride 10.3
corporate 10.3
manager 10.2
film 10.1
face 9.9
writing 9.9
pensive 9.8
new 9.7
portrait 9.7
design 9.6
serious 9.5
businesspeople 9.5
happy 9.4
two 9.3
negative 9.2
room 9.1
hand 9.1
drawing 8.9
handsome 8.9
indoors 8.8
symbol 8.7
lab 8.7
colleagues 8.7
couple 8.7
chemistry 8.7
engineer 8.7
architecture 8.6
plan 8.5
coat 8.5
health 8.3
cheerful 8.1
success 8
graphic 8
medicine 7.9
designing 7.9
scientist 7.8
organizer 7.8
black 7.8
education 7.8
laboratory 7.7
test 7.7
grunge 7.7
case 7.6
research 7.6
pencil 7.6
finance 7.6
patient 7.6
biology 7.6
groom 7.5
horizontal 7.5
sign 7.5
company 7.4
teamwork 7.4
bar 7.4
occupation 7.3
clothing 7.3
baron 7.3
indoor 7.3
treasury 7.2
idea 7.1
love 7.1
working 7.1
stage 7.1
happiness 7

Google
created on 2022-01-23

Photograph 94.2
Adaptation 79.2
Art 77.6
Snapshot 74.3
Event 70.7
Font 70.6
Room 70.2
Vintage clothing 69.6
Illustration 68
Hat 65.8
Plant 65.7
Stock photography 64.3
Monochrome photography 63.9
History 63
Service 61.9
Monochrome 61
Visual arts 60.3
Team 55.7
Uniform 55.3
Pattern 54.3

Microsoft
created on 2022-01-23

text 99.1
person 99.1
drawing 93.6
sketch 90.5
clothing 85.8
cartoon 76
man 67.8
posing 62.8
old 40.1

Face analysis

AWS Rekognition

Age 45-53
Gender Male, 58.9%
Happy 47.7%
Angry 24.3%
Calm 19.5%
Disgusted 3.2%
Fear 2%
Confused 1.4%
Surprised 1.3%
Sad 0.6%

AWS Rekognition

Age 38-46
Gender Male, 98.2%
Calm 68.8%
Sad 27.2%
Confused 1.1%
Surprised 1%
Angry 1%
Disgusted 0.3%
Happy 0.2%
Fear 0.2%

AWS Rekognition

Age 42-50
Gender Male, 99.6%
Calm 95.4%
Happy 3.8%
Angry 0.3%
Confused 0.2%
Sad 0.1%
Surprised 0.1%
Disgusted 0.1%
Fear 0%

AWS Rekognition

Age 56-64
Gender Male, 99.5%
Calm 99.8%
Sad 0.1%
Confused 0%
Angry 0%
Surprised 0%
Happy 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 54-62
Gender Male, 99.4%
Sad 41.3%
Fear 28.4%
Happy 13.1%
Confused 8.6%
Surprised 5.9%
Disgusted 1.3%
Angry 0.8%
Calm 0.6%

AWS Rekognition

Age 26-36
Gender Female, 95.1%
Calm 99.6%
Sad 0.2%
Surprised 0%
Confused 0%
Angry 0%
Happy 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 45-51
Gender Male, 99.3%
Calm 75.3%
Sad 18.1%
Confused 3.7%
Happy 1.5%
Angry 0.5%
Disgusted 0.3%
Surprised 0.3%
Fear 0.2%
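
Each AWS Rekognition face block above reports a full emotion distribution. A per-face summary (the dominant emotion) can be derived with a short, hypothetical helper over scores transcribed in part from the seven blocks:

```python
def dominant_emotion(scores):
    """Return the (emotion, confidence) pair with the highest confidence."""
    return max(scores.items(), key=lambda kv: kv[1])

# Leading emotion scores transcribed from the seven AWS Rekognition faces above
faces = [
    {"Happy": 47.7, "Angry": 24.3, "Calm": 19.5},
    {"Calm": 68.8, "Sad": 27.2},
    {"Calm": 95.4, "Happy": 3.8},
    {"Calm": 99.8, "Sad": 0.1},
    {"Sad": 41.3, "Fear": 28.4, "Happy": 13.1},
    {"Calm": 99.6, "Sad": 0.2},
    {"Calm": 75.3, "Sad": 18.1},
]

for i, face in enumerate(faces, 1):
    emotion, conf = dominant_emotion(face)
    print(f"face {i}: {emotion} ({conf}%)")
```

Five of the seven faces come out Calm, matching the block-by-block listing above.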

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%
Helmet 62.5%

Categories

Imagga

paintings art 100%

Text analysis

Amazon

21447
LAMIE