Human Generated Data

Title

Untitled (two women posed with quilt, wedding and portrait photographs on wall)

Date

1959

People

Artist: Francis J. Sullivan, American 1916 - 1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18723

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 98.7
Human 98.7
Person 97.2
Furniture 93.1
Clothing 74.1
Apparel 74.1
Face 70.4
Shoe 68.8
Footwear 68.8
Person 65.7
Female 64.8
People 64.5
Overcoat 62.1
Coat 62.1
Indoors 60.6
Shorts 60
Bag 57.6
Door 55.9
Suit 55.4

Imagga
created on 2022-03-05

accordion 100
keyboard instrument 100
musical instrument 100
wind instrument 81.8
man 28.2
male 23.4
person 22.3
people 21.7
adult 19.6
happy 18.2
handcart 16.3
happiness 14.1
portrait 13.6
smile 13.5
wheeled vehicle 13
smiling 13
lifestyle 13
child 12.9
holding 12.4
barrow 12.2
cheerful 11.4
couple 11.3
fashion 10.6
old 10.4
outdoors 10.4
sexy 10.4
music 10.4
women 10.3
day 10.2
casual 10.2
chair 10.1
holiday 10
joy 10
outdoor 9.9
park 9.9
clothing 9.8
family 9.8
business 9.7
grass 9.5
play 9.5
sitting 9.4
musician 9
summer 9
active 9
shopping cart 9
fun 9
kid 8.9
boy 8.7
cute 8.6
work 8.4
executive 8.3
shopping 8.3
hat 8.2
playing 8.2
mother 8.1
job 8
working 8
businessman 7.9
standing 7.8
men 7.7
attractive 7.7
father 7.7
youth 7.7
two 7.6
clothes 7.5
city 7.5
style 7.4
teen 7.3
lady 7.3
present 7.3
black 7.2
suit 7.2
blond 7.2

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

furniture 97.1
chair 91.2
text 91
clothing 90.4
person 90.4
table 90.2
black and white 90.1
man 74.9
footwear 64.5

Face analysis

AWS Rekognition

Age 48-56
Gender Male, 93.9%
Surprised 54.3%
Happy 42.6%
Disgusted 0.9%
Confused 0.7%
Calm 0.6%
Fear 0.4%
Angry 0.2%
Sad 0.2%

AWS Rekognition

Age 50-58
Gender Male, 99.6%
Happy 87.6%
Sad 7.7%
Confused 1.4%
Calm 1.1%
Surprised 1%
Disgusted 0.6%
Angry 0.3%
Fear 0.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.7%
Shoe 68.8%

Captions

Microsoft

a person sitting in a chair 74.9%
a person sitting on a table 54.2%
a person sitting at a table 54.1%

Text analysis

Amazon

10
SVEETA
бигсо SVEETA EIT by 10
EIT
by
бигсо