Human Generated Data

Title

Untitled (seated woman opening gift next to two standing women)

Date

1958

People

Artist: Joseph Janney Steinmetz, American 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4650

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Person 99.8
Human 99.8
Person 99.6
Person 93.6
Clothing 92.3
Apparel 92.3
Person 89
Female 87.7
Person 87.7
Person 77.3
Person 76.6
Dress 75.6
Floor 73
People 69.8
Girl 68.1
Woman 67.7
Sitting 64.7
Photography 62.6
Photo 62.6
Chair 62.5
Furniture 62.5
Drawing 61.8
Art 61.8
Hairdresser 61.7
Worker 61.7
Flooring 60.6
Suit 57
Coat 57
Overcoat 57
Kid 55.4
Child 55.4

Clarifai
created on 2023-10-15

people 99.5
monochrome 96.9
group 95.1
man 94.1
adult 93.6
woman 93.5
commerce 88.5
child 85
illustration 84.5
sit 82.6
family 81.4
chair 78.8
education 78.4
room 76.9
boy 74.2
group together 73.3
music 73.2
actor 73
many 72.9
crowd 72.2

Imagga
created on 2021-12-14

person 29.3
people 27.9
man 27.5
adult 25.8
teacher 23.4
male 23.4
chair 20.6
musical instrument 19.6
business 19.4
newspaper 18.7
businessman 18.5
product 17.6
smiling 17.4
lifestyle 17.3
sitting 17.2
room 16.9
professional 16.4
senior 15.9
happy 15
portrait 14.9
couple 14.8
wind instrument 14.2
indoors 14
educator 13.7
day 13.3
sibling 12.6
job 12.4
interior 12.4
holding 12.4
table 12.1
group 12.1
office 12
men 12
kin 12
creation 12
indoor 11.9
women 11.9
looking 11.2
home 11.2
clothing 11.1
accordion 10.9
pretty 10.5
old 10.4
dress 9.9
classroom 9.9
work 9.8
elderly 9.6
corporate 9.4
patient 9.4
mature 9.3
smile 9.3
care 9
fashion 9
seat 9
lady 8.9
family 8.9
medical 8.8
keyboard instrument 8.7
happiness 8.6
elegant 8.6
casual 8.5
two 8.5
manager 8.4
worker 8.4
floor 8.4
businesswoman 8.2
case 8.1
to 8
hair 7.9
together 7.9
hands 7.8
older 7.8
retired 7.8
class 7.7
attractive 7.7
health 7.6
child 7.5
meeting 7.5
house 7.5
human 7.5
clothes 7.5
mother 7.5
pensioner 7.4
technology 7.4
back 7.3
cheerful 7.3
alone 7.3
new 7.3
board 7.2
computer 7.2
team 7.2
holiday 7.2
face 7.1
nurse 7

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

text 98.1
drawing 91.2
black and white 84.2
sketch 79.3
person 71.1
clothing 66.7
old 47

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 40-58
Gender Male, 55.7%
Calm 86.7%
Sad 10.2%
Confused 1.3%
Happy 0.7%
Surprised 0.7%
Angry 0.1%
Disgusted 0.1%
Fear 0.1%

AWS Rekognition

Age 22-34
Gender Female, 64.2%
Calm 89.9%
Surprised 4.5%
Sad 2%
Angry 1.5%
Happy 1.3%
Confused 0.6%
Disgusted 0.2%
Fear 0.1%

AWS Rekognition

Age 21-33
Gender Male, 56.5%
Happy 95.4%
Calm 2.3%
Surprised 1%
Confused 0.6%
Angry 0.3%
Sad 0.2%
Disgusted 0.2%
Fear 0.1%

AWS Rekognition

Age 10-20
Gender Female, 71.3%
Calm 91.3%
Happy 5.1%
Sad 2.7%
Confused 0.6%
Surprised 0.1%
Angry 0.1%
Fear 0.1%
Disgusted 0.1%

AWS Rekognition

Age 35-51
Gender Female, 76.1%
Calm 52.4%
Happy 34.4%
Sad 8.4%
Confused 2.8%
Surprised 0.7%
Disgusted 0.5%
Angry 0.4%
Fear 0.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.8%
Chair 62.5%

Categories

Imagga

paintings art 100%

Text analysis

Amazon

VETERANS
WAR
VESS
25205
WORLD WAR VETERANS №1
WORLD
№1

Google

ONED WAR VETERANS NOI NE 2 5 20 S
ONED
VETERANS
NOI
5
WAR
NE
2
20
S