Human Generated Data

Title

Untitled (older man and woman sitting in front of piano)

Date

1951

People

Artist: Francis J. Sullivan, American, 1916-1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18181

Machine Generated Data

Tags

Amazon
created on 2022-03-04

Furniture 100
Chair 100
Apparel 99.8
Clothing 99.8
Human 99.6
Person 99.6
Person 99.3
Overcoat 97.3
Coat 97.3
Suit 97.3
Home Decor 82.5
Face 82.1
Tuxedo 81.3
Female 78.7
Portrait 71.2
Photography 71.2
Photo 71.2
Couch 70.1
Plant 70
Woman 61.7
Indoors 61.4
Table Lamp 59.5
Lamp 59.5
Dress 59.3
Living Room 57.2
Room 57.2
Shorts 56.4
Man 55.3

Imagga
created on 2022-03-04

man 34.3
person 31.3
male 28.4
people 27.3
home 23.1
adult 20.9
teacher 20.4
senior 19.7
men 18.9
room 18.8
professional 16.8
sitting 15.5
chair 15.2
smiling 15.2
indoors 14.9
mature 14.9
old 14.6
grandfather 14.5
educator 14.3
family 14.2
mother 13.4
happy 13.2
retirement 12.5
working 12.4
portrait 12.3
computer 12
elderly 11.5
indoor 10.9
laptop 10.9
lifestyle 10.8
smile 10.7
bass 10.6
love 10.3
black 10.2
scholar 10.2
alone 10
holding 9.9
hand 9.9
businessman 9.7
retired 9.7
sax 9.5
pensioner 9.4
casual 9.3
back 9.2
occupation 9.2
business 9.1
work 8.9
father 8.9
medical 8.8
older 8.7
couple 8.7
women 8.7
stringed instrument 8.6
child 8.6
musical instrument 8.5
patient 8.4
health 8.3
care 8.2
music 8.2
aged 8.1
intellectual 8.1
office 8
parent 8
looking 8
interior 8
happiness 7.8
seat 7.8
play 7.8
crutch 7.7
resting 7.6
life 7.6
relaxed 7.5
one 7.5
vintage 7.4
kin 7.4
lady 7.3
handsome 7.1
job 7.1
hospital 7.1
bowed stringed instrument 7
together 7

Google
created on 2022-03-04

Microsoft
created on 2022-03-04

person 96.3
text 95.4
clothing 90.3
man 77.7

Face analysis

AWS Rekognition

Age 50-58
Gender Male, 98.7%
Calm 78%
Happy 19.6%
Confused 0.9%
Sad 0.6%
Surprised 0.4%
Disgusted 0.3%
Angry 0.1%
Fear 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%

Captions

Microsoft

a man holding a sign 74.6%
a person posing for the camera 74.5%
a man posing for a picture 74.4%

Text analysis

Amazon

TEA
KODVK-8VLELA

Google

YT3RA8-
YT3RA8-