Human Generated Data

Title

Untitled (women and children next to shack)

Date

1950

People

Artist: Francis J. Sullivan, American, 1916–1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18185

Machine Generated Data

Tags

Amazon
created on 2022-03-04

Nature 99.5
Shelter 99.5
Outdoors 99.5
Rural 99.5
Building 99.5
Countryside 99.5
Urban 93.9
Person 91.5
Human 91.5
Hut 88.4
Shack 85.9
Person 83.2
Housing 80.3
Face 71.7
Person 66.6
Person 66.2
People 65.9
Person 60.4
House 59.2
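Label lists like the Amazon set above are returned as (label, confidence) pairs, and the same label can appear more than once (here "Person" is detected several times at different scores). A minimal post-processing sketch in Python, using the tag values copied from this record; the `top_tags` helper is illustrative, not part of any vendor API:

```python
# Deduplicate (label, confidence) pairs, keeping the highest score per label,
# and drop anything below a confidence threshold.
def top_tags(tags, threshold=80.0):
    best = {}
    for label, score in tags:
        if score >= threshold and score > best.get(label, 0.0):
            best[label] = score
    # Sort remaining labels by descending confidence.
    return sorted(best.items(), key=lambda kv: -kv[1])

# Values copied from the Amazon list above.
amazon_tags = [
    ("Nature", 99.5), ("Shelter", 99.5), ("Outdoors", 99.5), ("Rural", 99.5),
    ("Building", 99.5), ("Countryside", 99.5), ("Urban", 93.9),
    ("Person", 91.5), ("Human", 91.5), ("Hut", 88.4), ("Shack", 85.9),
    ("Person", 83.2), ("Housing", 80.3), ("Face", 71.7), ("Person", 66.6),
    ("People", 65.9), ("House", 59.2),
]
print(top_tags(amazon_tags))
```

With the default 80-point threshold this keeps one "Person" entry at 91.5 and drops the lower-confidence duplicates.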

Imagga
created on 2022-03-04

wheeled vehicle 32.9
structure 28.2
mobile home 24.4
vehicle 22.8
old 22.3
building 21.1
trailer 19.5
housing 19.3
cemetery 16.5
house 15.9
landscape 14.9
sky 14.7
stone 12.9
dirty 12.6
conveyance 12.4
tree 12.4
vintage 11.6
memorial 11
destruction 10.7
forest 10.4
gravestone 10.4
handcart 10.3
dark 10
trees 9.8
abandoned 9.8
rural 9.7
fog 9.6
grunge 9.4
outdoor 9.2
travel 9.1
environment 9
accident 8.8
broken 8.7
antique 8.7
scene 8.7
architecture 8.6
car 8.4
park 8.2
protection 8.2
danger 8.2
history 8
mountain 8
home 8
wall 7.9
autumn 7.9
barrow 7.9
urban 7.9
black 7.8
spooky 7.8
season 7.8
empty 7.7
death 7.7
summer 7.7
old fashioned 7.6
tricycle 7.5
outdoors 7.5
light 7.4
industrial 7.3
barbershop 7.3
holiday 7.2
scenic 7

Google
created on 2022-03-04

Microsoft
created on 2022-03-04

grass 99.6
outdoor 99.4
black and white 91.3
black 73.7
text 68
old 62.1
person 60.1
house 58.5

Face analysis

Amazon

Google

AWS Rekognition

Age 24-34
Gender Male, 90.2%
Happy 61.3%
Surprised 29.3%
Fear 3.9%
Calm 2%
Angry 1.3%
Disgusted 1.3%
Sad 0.7%
Confused 0.2%

AWS Rekognition

Age 31-41
Gender Male, 99.9%
Calm 92.9%
Happy 3.3%
Sad 1.7%
Angry 1.4%
Confused 0.3%
Surprised 0.2%
Disgusted 0.2%
Fear 0.1%

AWS Rekognition

Age 33-41
Gender Male, 99.3%
Angry 43.1%
Calm 23.2%
Fear 15.5%
Happy 10.6%
Confused 3.1%
Surprised 1.9%
Disgusted 1.6%
Sad 0.9%

AWS Rekognition

Age 19-27
Gender Male, 99.3%
Calm 80.7%
Fear 9.7%
Sad 3.6%
Happy 3.2%
Angry 1%
Surprised 0.7%
Confused 0.6%
Disgusted 0.6%
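Each AWS Rekognition face entry above reports a full distribution of emotion confidences; a common summary is to keep only the dominant emotion per face. A small sketch, using the percentages copied from the four face records above (the `dominant` helper is illustrative, not the Rekognition API itself):

```python
# Emotion confidences per detected face, copied from the entries above.
faces = [
    {"Happy": 61.3, "Surprised": 29.3, "Fear": 3.9, "Calm": 2.0,
     "Angry": 1.3, "Disgusted": 1.3, "Sad": 0.7, "Confused": 0.2},
    {"Calm": 92.9, "Happy": 3.3, "Sad": 1.7, "Angry": 1.4,
     "Confused": 0.3, "Surprised": 0.2, "Disgusted": 0.2, "Fear": 0.1},
    {"Angry": 43.1, "Calm": 23.2, "Fear": 15.5, "Happy": 10.6,
     "Confused": 3.1, "Surprised": 1.9, "Disgusted": 1.6, "Sad": 0.9},
    {"Calm": 80.7, "Fear": 9.7, "Sad": 3.6, "Happy": 3.2,
     "Angry": 1.0, "Surprised": 0.7, "Confused": 0.6, "Disgusted": 0.6},
]

def dominant(emotions):
    # Return the emotion label with the highest confidence.
    return max(emotions, key=emotions.get)

print([dominant(f) for f in faces])  # ['Happy', 'Calm', 'Angry', 'Calm']
```

Note that the third face's "Angry" reading (43.1%) barely outweighs "Calm" (23.2%), a reminder that the dominant label alone can hide low overall confidence.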

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Feature analysis

Amazon

Person 91.5%

Captions

Microsoft

a vintage photo of a group of people sitting in front of a house 83.9%
a vintage photo of a group of people standing in front of a house 83.8%
a vintage photo of a group of people sitting in front of a building 77.6%

Text analysis

Amazon

PIS
<<<<<<<<<<