Human Generated Data

Title

Untitled (two women and a man performing at actors' theater)

Date

1969

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.11370

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-14

Person 99.6
Human 99.6
Person 99.5
Person 99.4
Clothing 99.1
Apparel 99.1
Female 87.8
Flooring 86.5
Chair 80.9
Furniture 80.9
Sleeve 80.1
Shorts 75.5
Shoe 74.7
Footwear 74.7
Woman 72.2
Floor 71.5
Long Sleeve 69.6
Suit 68.9
Overcoat 68.9
Coat 68.9
People 64.7
Girl 60
Photography 59.5
Photo 59.5
Door 58
Skirt 57.7
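
The Amazon tags above are the kind of label-and-confidence output returned by Amazon Rekognition's DetectLabels operation. A minimal sketch of how such tags could be produced with boto3 is shown below; the file name and region are placeholders, not values from this record.

```python
import boto3

# Rekognition client; the region is a placeholder.
client = boto3.client("rekognition", region_name="us-east-1")

# Load the digitized photograph as raw bytes (file name is hypothetical).
with open("steinmetz_4.2002.11370.jpg", "rb") as f:
    image_bytes = f.read()

# DetectLabels returns object/scene labels with confidence scores,
# comparable to the "Person 99.6", "Clothing 99.1", ... tags above.
response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,
    MinConfidence=55,
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```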

Clarifai
created on 2023-10-25

people 99.9
adult 99.3
woman 97.9
group 97.9
man 97.1
group together 95.5
two 94.7
three 94.3
wear 93.5
four 93
monochrome 90.8
indoors 90.2
facial expression 90
child 89.7
medical practitioner 82.8
offspring 81.4
room 81.2
several 80.2
actor 79.5
uniform 78.5

Imagga
created on 2022-01-14

newspaper 55.8
product 42.2
people 36.2
person 35.9
man 32.9
creation 32.8
adult 28.2
male 22.7
office 22.7
women 20.6
business 20
businessman 19.4
happy 18.8
indoors 18.4
portrait 18.1
health 18.1
work 18
patient 17.6
room 17.6
men 17.2
smiling 16.6
lifestyle 16.6
medical 15.9
smile 15.7
nurse 14.7
modern 14
chair 13.8
waiter 13.8
clinic 13.8
team 13.4
pretty 13.3
interior 13.3
professional 13
worker 13
corporate 12.9
home 12.8
job 12.4
medicine 12.3
couple 12.2
businesswoman 11.8
hospital 11.7
shop 11.5
working 11.5
group 11.3
teamwork 11.1
dining-room attendant 11.1
20s 11
employee 10.9
clothing 10.9
talking 10.5
casual 10.2
two 10.2
life 10.2
inside 10.1
indoor 10
planner 9.8
building 9.8
two people 9.7
together 9.6
standing 9.6
cafeteria 9.5
meeting 9.4
hall 9.2
black 9
human 9
urban 8.7
negative 8.7
happiness 8.6
walking 8.5
desk 8.5
doctor 8.5
executive 8.4
manager 8.4
fashion 8.3
alone 8.2
lady 8.1
looking 8
film 7.8
face 7.8
check 7.7
attractive 7.7
exam 7.7
twenties 7.6
businesspeople 7.6
case 7.6
communication 7.6
house 7.5
technology 7.4
care 7.4
new 7.3
success 7.2
barbershop 7.2
computer 7.2
suit 7.2
day 7.1

Microsoft
created on 2022-01-14

person 96.1
clothing 96
text 95.5
dress 91.7
woman 89.6
smile 85.1
standing 83.7
footwear 75.9
posing 71.5

Color Analysis

Face analysis

AWS Rekognition

Age 33-41
Gender Female, 99.6%
Calm 45.7%
Angry 24.7%
Happy 17.8%
Surprised 4%
Disgusted 3.4%
Confused 2.2%
Sad 1.7%
Fear 0.6%

AWS Rekognition

Age 42-50
Gender Female, 94.1%
Happy 79.8%
Calm 11.9%
Confused 3%
Surprised 2%
Sad 0.9%
Disgusted 0.9%
Fear 0.7%
Angry 0.7%

AWS Rekognition

Age 31-41
Gender Male, 92.4%
Calm 99.5%
Angry 0.2%
Happy 0.1%
Disgusted 0.1%
Sad 0%
Confused 0%
Fear 0%
Surprised 0%
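
The three AWS Rekognition face records above (age range, gender, and ranked emotion confidences) match the shape of the DetectFaces response. A hedged sketch, reusing the hypothetical client and image bytes from the label example:

```python
# Request the full set of facial attributes (age range, gender, emotions).
response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]        # e.g. {"Low": 33, "High": 41}
    gender = face["Gender"]       # e.g. {"Value": "Female", "Confidence": 99.6}
    emotions = sorted(face["Emotions"],
                      key=lambda e: e["Confidence"], reverse=True)
    print(f'Age {age["Low"]}-{age["High"]}, '
          f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in emotions:
        print(f'  {emotion["Type"]} {emotion["Confidence"]:.1f}%')
```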

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely
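
The Google Vision results above report per-face likelihood buckets (Very unlikely through Very likely) rather than percentages. A minimal sketch with the google-cloud-vision client, again assuming a local copy of the image (file name hypothetical):

```python
from google.cloud import vision

gclient = vision.ImageAnnotatorClient()

with open("steinmetz_4.2002.11370.jpg", "rb") as f:
    gimage = vision.Image(content=f.read())

# face_detection returns one FaceAnnotation per detected face, with
# likelihood enums for joy, sorrow, anger, surprise, headwear, and blur.
gresponse = gclient.face_detection(image=gimage)

for face in gresponse.face_annotations:
    print(
        "Joy:", vision.Likelihood(face.joy_likelihood).name,
        "Anger:", vision.Likelihood(face.anger_likelihood).name,
        "Sorrow:", vision.Likelihood(face.sorrow_likelihood).name,
        "Surprise:", vision.Likelihood(face.surprise_likelihood).name,
        "Headwear:", vision.Likelihood(face.headwear_likelihood).name,
        "Blurred:", vision.Likelihood(face.blurred_likelihood).name,
    )
```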

Feature analysis

Amazon

Person 99.6%
Shoe 74.7%

Categories

Text analysis

Amazon

57854-A.
КАО
MJ17--YT37A°2 КАО
MJ17--YT37A°2

Google

57854-A. MJIA--YT3RA°2
57854-A.
MJIA--YT3RA°2
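
The Amazon and Google entries above are raw OCR detections of text visible in the photograph. A minimal sketch of how the Amazon lines could be reproduced with Rekognition's DetectText, reusing the hypothetical client and image bytes from the label example:

```python
# DetectText returns LINE and WORD detections with confidence scores.
response = client.detect_text(Image={"Bytes": image_bytes})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(f'{detection["DetectedText"]} ({detection["Confidence"]:.1f}%)')
```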