Human Generated Data

Title

Untitled (man and woman posed in front of lace curtained window, cast iron radiator)

Date

1956

People

Artist: Francis J. Sullivan, American, 1916–1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18635

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Human 99.7
Person 99.7
Clothing 99.6
Apparel 99.6
Person 99.1
Chair 94.4
Furniture 94.4
Female 93.8
Dress 87.4
Sleeve 86.6
Sitting 85.5
Flooring 84.2
Woman 82.9
Home Decor 79.3
Floor 76.3
Shorts 74.8
Face 70.3
Long Sleeve 65.6
Photography 65.2
Portrait 65.2
Photo 65.2
Girl 62.2
Door 61.2
Pants 59.1
Coat 56.3
Overcoat 56.3
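
The numbers beside each tag are confidence scores (percentages). Listings like this are often trimmed to a confidence threshold before display; a minimal sketch of that step in Python, using a few of the Rekognition labels above (the 90% threshold is an arbitrary assumption, not part of the record):

```python
# A few of the Amazon Rekognition labels above, as tag -> confidence-% pairs.
labels = {
    "Human": 99.7, "Person": 99.7, "Clothing": 99.6,
    "Chair": 94.4, "Female": 93.8, "Dress": 87.4,
    "Portrait": 65.2, "Girl": 62.2, "Overcoat": 56.3,
}

# Keep only labels at or above an (assumed) display threshold of 90%.
THRESHOLD = 90.0
confident = {tag: c for tag, c in labels.items() if c >= THRESHOLD}

print(sorted(confident))  # → ['Chair', 'Clothing', 'Female', 'Human', 'Person']
```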

Imagga
created on 2022-03-05

man 32.3
people 31.8
person 26.2
male 23.7
adult 22.9
professional 22.1
crutch 20.4
business 19.4
businessman 18.5
portrait 18.1
men 17.2
women 16.6
happy 16.3
staff 15.8
suit 14.8
work 14.2
teacher 14.1
worker 14
smile 13.5
couple 13.1
lifestyle 13
corporate 12.9
two 12.7
happiness 12.5
stick 12.4
holding 12.4
walking 12.3
smiling 12.3
standing 12.2
office 11.4
executive 11.3
pretty 11.2
wall 11.1
day 11
educator 10.8
active 10.8
success 10.5
outdoors 10.4
life 10.4
looking 10.4
casual 10.2
street 10.1
room 10.1
fashion 9.8
attractive 9.8
old 9.8
lady 9.7
one 9.7
pedestrian 9.4
building 9.2
city 9.1
businesswoman 9.1
dress 9
domestic 9
summer 9
human 9
job 8.9
walk 8.6
model 8.6
youth 8.5
black 8.4
modern 8.4
communication 8.4
health 8.3
alone 8.2
exercise 8.2
chair 8.1
family 8
love 7.9
together 7.9
full length 7.8
run 7.7
outdoor 7.6
finance 7.6
leisure 7.5
future 7.4
fit 7.4
cheerful 7.3
groom 7.3
sport 7.3
fitness 7.2
handsome 7.1
interior 7.1

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

text 93.9
footwear 92
clothing 91.1
outdoor 87.4
person 86.2
black and white 74

Face analysis

AWS Rekognition

Age 51-59
Gender Male, 99.4%
Surprised 31.5%
Calm 30.8%
Happy 23.7%
Sad 5.8%
Disgusted 4.6%
Confused 2.4%
Angry 0.9%
Fear 0.3%

AWS Rekognition

Age 48-54
Gender Male, 99.9%
Sad 77.2%
Surprised 11.8%
Angry 5.2%
Happy 1.9%
Fear 1.4%
Disgusted 1.1%
Confused 0.8%
Calm 0.5%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%

Captions

Microsoft

a man standing in front of a building 72.1%
a man standing next to a building 68.8%
a man holding a suitcase 45.1%

Text analysis

Amazon

MJ17--YT37AS--XAg

Google

MJI7--YT3RA°2--
3009
5OM
Gr
MJI7--YT3RA°2-- 3009 Gr 5OM