Human Generated Data

Title

Untitled (family portrait in stable)

Date

1960

People

Artist: Francis J. Sullivan, American, 1916–1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18741

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99.7
Human 99.7
Apparel 99.6
Clothing 99.6
Person 99.6
Nature 99
Outdoors 99
Rural 99
Building 99
Shelter 99
Countryside 99
Shorts 98.9
Person 98.4
Animal 98.2
Mammal 98.2
Horse 98.2
Female 97.2
Person 97
Face 95.7
Footwear 89.4
Shoe 89.4
Woman 89.1
Pet 87.4
Canine 87.4
Dog 87.4
Pants 83.3
Smile 82.7
Person 81.2
Urban 77.9
Street 77.9
Town 77.9
Road 77.9
City 77.9
Girl 73.1
Photography 70.8
Photo 70.8
Portrait 70.8
Indoors 69.5
Play 69.1
Kid 67.4
Child 67.4
Housing 67.1
Dress 65
Shoe 63.9
Person 63.3
Alley 63.2
Alleyway 63.2
Skirt 62.1
Shoe 60.3
Wood 59.7
Shoe 58.2
Room 57.2
Man 56
Tree 55.4
Plant 55.4
Denim 55.2
Jeans 55.2
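The Amazon tags above are the kind of labeled-confidence output that AWS Rekognition's DetectLabels API returns. A minimal sketch of filtering such labels by a confidence threshold, using a hard-coded sample built from the tags above rather than a live API call (which would require boto3 and AWS credentials):

```python
# Sketch: filter Rekognition-style labels by confidence.
# The sample response is hand-built from the tags listed above;
# a real call would use boto3's rekognition.detect_labels.

sample_response = {
    "Labels": [
        {"Name": "Person", "Confidence": 99.7},
        {"Name": "Horse", "Confidence": 98.2},
        {"Name": "Dog", "Confidence": 87.4},
        {"Name": "Jeans", "Confidence": 55.2},
    ]
}

def labels_above(response, threshold):
    """Return (name, confidence) pairs at or above the threshold."""
    return [
        (label["Name"], label["Confidence"])
        for label in response["Labels"]
        if label["Confidence"] >= threshold
    ]

print(labels_above(sample_response, 80.0))
# [('Person', 99.7), ('Horse', 98.2), ('Dog', 87.4)]
```

Lower-confidence tags (such as "Jeans 55.2" above) are usually filtered out before display; the threshold is a display choice, not part of the API.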

Imagga
created on 2022-03-05

canvas tent 46.1
people 22.9
man 22.8
adult 16.3
travel 16.2
person 14.4
male 14.2
vacation 13.1
together 11.4
men 11.2
uniform 10.8
happy 10.6
outdoors 9.8
hospital 9.8
couple 9.6
outside 9.4
life 9.3
sport 9.1
summer 9
indoors 8.8
women 8.7
love 8.7
two 8.5
leisure 8.3
transport 8.2
passenger 8.1
transportation 8.1
activity 8.1
smiling 8
business 7.9
holiday 7.9
tent 7.8
portrait 7.8
old 7.7
clothing 7.6
wedding 7.4
work 7.3
dress 7.2
religion 7.2
medical 7.1
worker 7

Microsoft
created on 2022-03-05

outdoor 96.9
horse 95.6
text 90.6
black and white 87.9
clothing 87.2
person 84
standing 75.2
man 70.1

Face analysis

AWS Rekognition

Age 22-30
Gender Female, 98.8%
Calm 97.9%
Sad 1.1%
Happy 0.3%
Angry 0.2%
Disgusted 0.2%
Surprised 0.1%
Confused 0.1%
Fear 0.1%

AWS Rekognition

Age 23-31
Gender Male, 99.6%
Calm 64.3%
Sad 10.7%
Angry 7.6%
Confused 5.3%
Happy 3.4%
Surprised 3.1%
Fear 3%
Disgusted 2.6%

AWS Rekognition

Age 33-41
Gender Male, 99.7%
Happy 85.3%
Surprised 6.8%
Calm 4.4%
Fear 1.1%
Sad 0.9%
Disgusted 0.6%
Confused 0.6%
Angry 0.3%

AWS Rekognition

Age 29-39
Gender Male, 97.3%
Surprised 48.9%
Calm 39.2%
Happy 3.8%
Disgusted 2.3%
Sad 1.9%
Confused 1.8%
Angry 1.2%
Fear 1%

AWS Rekognition

Age 38-46
Gender Male, 52.2%
Sad 77.5%
Calm 21.4%
Confused 0.6%
Happy 0.2%
Angry 0.1%
Disgusted 0.1%
Fear 0%
Surprised 0%

AWS Rekognition

Age 27-37
Gender Male, 96.2%
Calm 99.1%
Sad 0.6%
Happy 0.2%
Confused 0%
Disgusted 0%
Angry 0%
Surprised 0%
Fear 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%
Horse 98.2%
Shoe 89.4%
Dog 87.4%

Captions

Microsoft

a group of people standing next to a horse 83.2%
a man standing next to a horse 83.1%
a group of people standing in front of a building 73.1%

Text analysis

Amazon

FARM
LORRIE
TWIN-GATE FARM
TWIN-GATE
TUBILEE
CAMA
THIN-UTE
03248

Google

FAR
LORRIE TUELER TWIN GATE FAR TWIN CAM
TWIN
TUELER
LORRIE
GATE
CAM