Human Generated Data

Title

Untitled (two women and a man performing at actors' theater)

Date

1969

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.11384

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-14

Clothing 99.9
Apparel 99.9
Person 99.8
Human 99.8
Person 99.7
Person 99.4
Female 96.9
Dress 94.3
Shoe 93.7
Footwear 93.7
Woman 90.6
Shorts 90.3
Skirt 87.3
Chair 85.8
Furniture 85.8
Shoe 76.8
Suit 70.2
Overcoat 70.2
Coat 70.2
Indoors 66.8
Girl 63.6
Door 60
Sleeve 59.8
Flooring 59.6
Table 59
Room 58
Floor 57.2
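
The Amazon scores above are label-detection confidences on a 0-100 scale. A minimal Python sketch of how such tags can be produced with Amazon Rekognition's DetectLabels API via boto3 follows; the filename is a placeholder, not part of this record.

import boto3

# Label detection on a local image; AWS credentials come from the standard
# configuration chain. MinConfidence drops low-scoring labels.
client = boto3.client("rekognition")
with open("steinmetz_11384.jpg", "rb") as f:  # hypothetical filename
    response = client.detect_labels(Image={"Bytes": f.read()}, MinConfidence=50)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')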

Clarifai
created on 2023-10-25

people 99.9
adult 98.7
group 97.8
woman 97.8
man 96.9
group together 94.8
medical practitioner 94.3
three 91.5
child 91
four 90.7
uniform 90.6
two 87.9
wear 87.3
indoors 84.9
hospital 82
healthcare 81.2
leader 80.8
monochrome 80.1
actor 79.3
offspring 75.1
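
Clarifai's concept scores are probabilities between 0 and 1, rendered above as percentages. A hedged sketch against Clarifai's v2 REST API; the model ID, API key, and filename are assumptions, not details from this record.

import base64
import requests

API_KEY = "YOUR_CLARIFAI_KEY"            # placeholder credential
MODEL_ID = "general-image-recognition"   # assumed general-model ID

with open("steinmetz_11384.jpg", "rb") as f:  # hypothetical filename
    image_b64 = base64.b64encode(f.read()).decode()

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"base64": image_b64}}}]},
)
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')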

Imagga
created on 2022-01-14

person 31.9
people 30.1
adult 28.7
clothing 26.7
man 22.8
indoors 22
newspaper 21.9
smiling 20.2
happy 20
women 19.8
lifestyle 19.5
patient 19
health 18.1
product 17.9
interior 17.7
brassiere 16.4
male 16.3
hospital 15.6
portrait 15.5
pretty 15.4
home 15.2
attractive 14.7
garment 14.6
business 14.6
fashion 14.3
creation 14.1
modern 14
men 13.7
covering 13.7
casual 13.6
happiness 13.3
woman's clothing 13.2
undergarment 13.2
consumer goods 12.6
medical 12.4
life 12.3
inside 12
work 11.8
office 11.6
businessman 11.5
room 11.4
nurse 11.1
day 11
indoor 11
model 10.9
shop 10.8
worker 10.7
care 10.7
smile 10.7
medicine 10.6
lady 10.5
human 10.5
chair 10.4
sexy 10.4
sitting 10.3
20s 10.1
waiter 10
house 10
working 9.7
one 9.7
couple 9.6
corporate 9.4
window 9.4
doctor 9.4
equipment 9.4
holding 9.1
cheerful 8.9
looking 8.8
clinic 8.6
dining-room attendant 8.1
light 8
face 7.8
full length 7.8
case 7.7
elegance 7.6
fun 7.5
leisure 7.5
building 7.4
treatment 7.3
relaxing 7.3
businesswoman 7.3
dress 7.2
black 7.2
cute 7.2
kitchen 7.2
bright 7.1
posing 7.1
job 7.1
planner 7
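
Imagga reports tag confidences on a 0-100 scale, matching the scores above. A sketch against Imagga's v2 tagging endpoint with HTTP basic auth; the key, secret, and filename are placeholders.

import requests

API_KEY = "YOUR_IMAGGA_KEY"        # placeholder credentials
API_SECRET = "YOUR_IMAGGA_SECRET"

# Upload and tag in a single multipart request.
with open("steinmetz_11384.jpg", "rb") as f:  # hypothetical filename
    resp = requests.post(
        "https://api.imagga.com/v2/tags",
        auth=(API_KEY, API_SECRET),
        files={"image": f},
    )

for tag in resp.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')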

Google
created on 2022-01-14

Microsoft
created on 2022-01-14

clothing 96.8
dress 94.7
person 90.1
woman 89.6
text 88.8
smile 86.7
footwear 83.8
standing 83.6
posing 38.7
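
Microsoft's tags come from the Azure Computer Vision Analyze operation, which returns confidences between 0 and 1 (shown above as percentages). A sketch against the v3.2 REST API; the endpoint, key, and filename are placeholders.

import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_AZURE_KEY"                                          # placeholder

with open("steinmetz_11384.jpg", "rb") as f:  # hypothetical filename
    resp = requests.post(
        f"{ENDPOINT}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags"},
        headers={
            "Ocp-Apim-Subscription-Key": KEY,
            "Content-Type": "application/octet-stream",
        },
        data=f.read(),
    )

for tag in resp.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')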

Face analysis

AWS Rekognition

Age 35-43
Gender Male, 73.3%
Calm 95%
Happy 2.6%
Sad 1.4%
Disgusted 0.2%
Confused 0.2%
Surprised 0.2%
Angry 0.2%
Fear 0.2%

AWS Rekognition

Age 33-41
Gender Female, 95.8%
Angry 67.9%
Calm 23.9%
Happy 2.8%
Disgusted 1.6%
Surprised 1.6%
Sad 0.9%
Confused 0.8%
Fear 0.5%

AWS Rekognition

Age 30-40
Gender Female, 60.7%
Calm 88.5%
Angry 8.3%
Disgusted 1.3%
Happy 1%
Surprised 0.2%
Fear 0.2%
Confused 0.2%
Sad 0.2%
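
Each block above is one detected face: an estimated age range, a gender guess with its confidence, and emotion scores that sum to roughly 100%. A minimal boto3 sketch of Amazon Rekognition's DetectFaces call with full attributes; the filename is a placeholder.

import boto3

client = boto3.client("rekognition")
with open("steinmetz_11384.jpg", "rb") as f:  # hypothetical filename
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    # Emotions arrive unsorted; sort to match the highest-first listing above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')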

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
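
Google Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than numeric scores, which is why these entries read "Very unlikely" or "Unlikely". A sketch using the google-cloud-vision client library (2.x); the filename is a placeholder and credentials are assumed to be configured in the environment.

from google.cloud import vision

client = vision.ImageAnnotatorClient()
with open("steinmetz_11384.jpg", "rb") as f:  # hypothetical filename
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# One annotation per detected face; each likelihood field is an enum member.
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)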

Feature analysis

Amazon

Person 99.8%
Shoe 93.7%
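
These feature percentages repeat the confidences of labels that Rekognition localizes with bounding boxes (people and shoes here). A self-contained sketch of pulling those instances from a DetectLabels response; the filename is a placeholder.

import boto3

client = boto3.client("rekognition")
with open("steinmetz_11384.jpg", "rb") as f:  # hypothetical filename
    response = client.detect_labels(Image={"Bytes": f.read()})

# Labels that localize objects carry an Instances list; each instance has
# its own confidence and a bounding box in relative coordinates.
for label in response["Labels"]:
    for instance in label.get("Instances", []):
        box = instance["BoundingBox"]
        print(f'{label["Name"]} {instance["Confidence"]:.1f}% '
              f'at left={box["Left"]:.2f}, top={box["Top"]:.2f}')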

Text analysis

Amazon

57854.
MJIR--YTERA 57854.
MJIR--YTERA

Google

57854. MJI7--YT37.
57854.
MJI7--YT37.
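
The strings above are raw OCR readings of the number and caption written on the print; Amazon and Google disagree on several characters, which is common for handwritten or stamped text, so neither reading should be treated as authoritative. A boto3 sketch of the Amazon side via DetectText; the filename is a placeholder.

import boto3

client = boto3.client("rekognition")
with open("steinmetz_11384.jpg", "rb") as f:  # hypothetical filename
    response = client.detect_text(Image={"Bytes": f.read()})

# LINE detections give whole lines; WORD detections repeat them word by word,
# which is why raw listings often show overlapping strings.
for det in response["TextDetections"]:
    if det["Type"] == "LINE":
        print(det["DetectedText"], f'({det["Confidence"]:.1f}%)')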