Human Generated Data

Title

Untitled (people in snow outside trailer)

Date

1958

People

Artist: Ken Whitmire Associates, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19826

Machine Generated Data

Tags (confidence scores in %)

Amazon
created on 2022-03-05

Human 99.6
Person 99.6
Person 99.6
Person 99.4
Person 99.3
Clothing 83.9
Apparel 83.9
Outdoors 79.3
Nature 78
Face 77.6
Plant 71.8
Tree 71.8
People 71.7
Building 66.8
Bunker 66.8
Fashion 60.1
Robe 60.1
Person 59.5
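
The label/confidence pairs above match the output shape of AWS Rekognition's detect_labels API. A minimal sketch of how such tags could be reproduced, assuming a local copy of the photograph (the file name and confidence threshold are illustrative):

    import boto3

    rekognition = boto3.client("rekognition")

    # Illustrative file name for a local copy of the photograph.
    with open("untitled_people_in_snow.jpg", "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=55,  # roughly the lowest score listed above (59.5)
        )

    # Each label carries a Name and a 0-100 Confidence. The repeated
    # "Person" rows above, and the "Person 99.6%" under Feature analysis
    # below, come from per-instance detections in a label's Instances list.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")
        for instance in label.get("Instances", []):
            print(f"  instance {instance['Confidence']:.1f}")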

Imagga
created on 2022-03-05

sky 26.8
landscape 26.8
travel 22.5
windshield 21.6
water 21.3
snow 18.1
mountain 17.2
tree 16.3
screen 16.2
ice 15.1
lake 14.7
sea 14.1
structure 13.9
protective covering 13.3
billboard 13.3
winter 12.8
night 12.4
vacation 12.3
holiday 12.2
light 12
clouds 11.8
scenery 11.7
dark 11.7
season 11.7
sunset 11.7
summer 11.6
river 11.6
forest 11.3
rock 11.3
mountains 11.1
art 11.1
island 11
negative 10.9
tourism 10.7
scenic 10.5
outdoors 10.4
cold 10.3
park 10.1
ocean 10
signboard 10
outdoor 9.9
trees 9.8
autumn 9.7
sun 9.7
fog 9.6
hill 9.4
natural 9.4
building 9.3
covering 9.1
fall 9
weather 8.8
film 8.8
scary 8.7
architecture 8.7
fear 8.7
cloud 8.6
sunrise 8.4
house 8.4
old 8.4
cool 8
stone 8
plastic bag 7.9
misty 7.9
iceberg 7.8
mist 7.7
beach 7.7
boat 7.5
environment 7.4
wall 7.3
morning 7.2
fantasy 7.2
coast 7.2
home 7.2
moon 7.1
rural 7
wooden 7
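
Imagga exposes its auto-tagger as a REST endpoint, and the word/score pairs above follow its response shape. A minimal sketch against the v2 /tags endpoint, assuming placeholder credentials and a publicly reachable image URL:

    import requests

    # Placeholder credentials and URL; substitute real values.
    API_KEY = "your_api_key"
    API_SECRET = "your_api_secret"
    IMAGE_URL = "https://example.com/untitled_people_in_snow.jpg"

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(API_KEY, API_SECRET),  # HTTP basic auth: key as user, secret as password
    )
    response.raise_for_status()

    # Tags arrive as {"confidence": <0-100>, "tag": {"en": "..."}} objects,
    # which flatten to the "word score" pairs listed above.
    for entry in response.json()["result"]["tags"]:
        print(f"{entry['tag']['en']} {entry['confidence']:.1f}")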

Google
created on 2022-03-05

(no tags returned)

Microsoft
created on 2022-03-05

text 98.1
black and white 88.3
clothing 79
person 77.2
bedroom 46.8
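
The Microsoft tags follow the shape returned by Azure Computer Vision's tagging operation. A minimal sketch with the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and image URL are placeholders, and the SDK's 0-1 confidences are scaled to match the percentages above:

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    # Placeholder Computer Vision resource; substitute real values.
    client = ComputerVisionClient(
        "https://<your-resource>.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("your_subscription_key"),
    )

    # tag_image returns a TagResult; each tag has a name and a 0-1 confidence.
    result = client.tag_image("https://example.com/untitled_people_in_snow.jpg")
    for tag in result.tags:
        print(f"{tag.name} {tag.confidence * 100:.1f}")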

Face analysis

AWS Rekognition

Age 22-30
Gender Female, 88.7%
Calm 99.7%
Disgusted 0.1%
Fear 0.1%
Happy 0.1%
Sad 0%
Surprised 0%
Angry 0%
Confused 0%

AWS Rekognition

Age 23-31
Gender Male, 58.3%
Calm 96.9%
Sad 2%
Confused 0.4%
Happy 0.2%
Disgusted 0.2%
Angry 0.1%
Fear 0.1%
Surprised 0.1%

AWS Rekognition

Age 45-53
Gender Female, 96.8%
Calm 46.7%
Sad 40.6%
Happy 9.2%
Confused 1.3%
Angry 0.8%
Disgusted 0.6%
Fear 0.5%
Surprised 0.3%

AWS Rekognition

Age 23-31
Gender Female, 68.2%
Happy 39.7%
Calm 33.3%
Sad 14.3%
Confused 9.2%
Angry 1.6%
Disgusted 0.6%
Surprised 0.6%
Fear 0.5%
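
Each block above (age range, gender with confidence, and an eight-way emotion distribution per face) mirrors AWS Rekognition's detect_faces output when the full attribute set is requested. A minimal sketch, again with an illustrative file name:

    import boto3

    rekognition = boto3.client("rekognition")

    with open("untitled_people_in_snow.jpg", "rb") as f:  # illustrative path
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # required to get age, gender, and emotions
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotions come back unsorted; sort descending to match the lists above.
        for emotion in sorted(face["Emotions"],
                              key=lambda e: e["Confidence"], reverse=True):
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")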

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
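
Unlike Rekognition, Google Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which is why the blocks above read "Very unlikely" / "Possible" / "Likely". A minimal sketch with the google-cloud-vision client (file name illustrative):

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("untitled_people_in_snow.jpg", "rb") as f:  # illustrative path
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    # One annotation per detected face; five faces would yield the five
    # blocks above. Likelihood is an enum, not a numeric score.
    for face in response.face_annotations:
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)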

Feature analysis

Amazon

Person 99.6%

Captions

Microsoft

a person sitting on a bed 32.2%
a person with graffiti on the side of a bed 30%
a bedroom with graffiti 29.9%
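
The ranked captions match the shape of Azure Computer Vision's describe operation, which returns candidate sentences with 0-1 confidences. A sketch using the same placeholder resource as the tagging example above:

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    client = ComputerVisionClient(
        "https://<your-resource>.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("your_subscription_key"),
    )

    # describe_image returns ranked candidate captions for one image.
    description = client.describe_image(
        "https://example.com/untitled_people_in_snow.jpg",
        max_candidates=3,
    )
    for caption in description.captions:
        print(f"{caption.text} {caption.confidence * 100:.1f}%")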

Text analysis

Amazon

2
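
The lone detected string ("2") is consistent with AWS Rekognition's detect_text output, which returns line- and word-level detections with confidences. A minimal sketch (file name illustrative):

    import boto3

    rekognition = boto3.client("rekognition")

    with open("untitled_people_in_snow.jpg", "rb") as f:  # illustrative path
        response = rekognition.detect_text(Image={"Bytes": f.read()})

    # Each detection is a LINE or a WORD with its own confidence score.
    for detection in response["TextDetections"]:
        print(detection["Type"], repr(detection["DetectedText"]),
              f"{detection['Confidence']:.1f}%")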