Human Generated Data

Title

Untitled (family picnic outside)

Date

1953

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17830

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Human 98
Person 98
Person 97.8
Person 97.7
Car 93.1
Automobile 93.1
Transportation 93.1
Vehicle 93.1
Person 91.8
Outdoors 88.7
Tree 82.8
Plant 82.8
Clothing 82.3
Apparel 82.3
Person 80.9
People 78.1
Nature 77.2
Vegetation 75.6
Yard 73
Female 72.3
Military 68.9
Meal 68.5
Food 68.5
Military Uniform 68
Bird 62.7
Animal 62.7
Girl 62.6
Grass 61.2
Photography 60.8
Photo 60.8
Art 60.7
Army 58.9
Armored 58.9

Imagga
created on 2022-02-26

gravestone 24.6
memorial 20.9
tree 20.7
stone 19.4
old 18.1
structure 17.9
landscape 17.9
outdoors 17.8
snow 17.5
wheeled vehicle 16.9
outdoor 16.1
winter 15.3
sky 15.3
forest 14.8
trees 14.2
rural 14.1
vehicle 14
people 13.9
park 13.2
tricycle 12.8
beach 12.2
man 12.1
travel 12
sand 11.4
cold 11.2
season 10.9
wood 10.8
male 10.7
summer 10.3
grass 10.3
mountain 10.1
child 10.1
house 10
danger 10
vintage 9.9
sunset 9.9
water 9.3
dirty 9
country 8.8
frozen 8.6
shovel 8.6
building 8.4
field 8.4
sport 8.2
protection 8.2
fall 8.1
conveyance 7.9
holiday 7.9
adult 7.8
snowy 7.8
outside 7.7
garden 7.7
frost 7.7
grunge 7.7
dirt 7.6
vacation 7.4
lake 7.3
peaceful 7.3
countryside 7.3
cemetery 7.2
road 7.2
autumn 7
architecture 7

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

outdoor 98
grass 96.5
text 92.5
tree 62.2
black and white 57.6

Face analysis

AWS Rekognition

Age 47-53
Gender Male, 54.2%
Calm 85.6%
Sad 5.4%
Happy 2.8%
Surprised 1.7%
Confused 1.4%
Angry 1.1%
Disgusted 1%
Fear 1%

AWS Rekognition

Age 38-46
Gender Male, 76.9%
Calm 99.8%
Sad 0.1%
Happy 0%
Surprised 0%
Disgusted 0%
Confused 0%
Angry 0%
Fear 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98%
Car 93.1%
Bird 62.7%

Captions

Microsoft

a group of people in an old photo of a person 74%
a group of people that are standing in the grass 71.3%
a group of people in a field 71.2%

Text analysis

Amazon

21-276
0-757
0-757 i
i
NAGOX

Google

0-757
0-757