Human Generated Data

Title

Untitled (three children in striped shirts posed with dog and goat outdoors)

Date

1961

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9846

Machine Generated Data

Tags

Amazon
created on 2022-01-28

Person 99.8
Human 99.8
Person 94.6
Tree 92.7
Plant 92.7
Dog 91.3
Canine 91.3
Pet 91.3
Animal 91.3
Mammal 91.3
Vegetation 91.2
Clothing 86.4
Apparel 86.4
Outdoors 84.2
Dog 83.5
Ground 78.6
Woodland 78.4
Forest 78.4
Nature 78.4
Land 78.4
Face 77.6
Female 68.1
Puppy 66.6
Person 66.2
Furniture 65.9
Photography 61.2
Photo 61.2
Smile 59.4
Grass 58.5
Girl 56.9
Hound 56.7
Husky 56.1
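
The Amazon tags above match the output shape of Amazon Rekognition's label detection. A minimal sketch of requesting similar labels with boto3, assuming configured AWS credentials and a hypothetical local file name:

import boto3

rekognition = boto3.client("rekognition")

with open("photograph.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,
    MinConfidence=55,  # the lowest confidence shown above is roughly 56%
)

# Each label carries a name and a confidence score, as listed above.
# Labels such as Person and Dog can also include Instances with bounding
# boxes, which is what the Feature analysis section further down reflects.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")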

Clarifai
created on 2023-10-27

people 99.8
adult 98.8
canine 98.2
group together 97.6
wear 97.5
dog 97.5
child 97.4
group 96.9
administration 96.8
man 95.8
three 95.1
recreation 95
war 93.5
two 92.7
woman 92.4
military 91.3
campsite 90.6
four 90.1
sibling 89.3
monochrome 88.9
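
The Clarifai concepts appear to come from a general image-recognition model. A rough sketch against what I understand to be Clarifai's v2 REST API; the personal access token, model ID, and image URL are all placeholders, and some account setups may also require a user/app ID block in the request body:

import requests

PAT = "YOUR_CLARIFAI_PAT"  # placeholder personal access token
MODEL_ID = "general-image-recognition"  # assumed public general model ID

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {PAT}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.com/photograph.jpg"}}}]},
)
response.raise_for_status()

# Concepts come back with a name and a 0-1 confidence value.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(concept["name"], round(concept["value"] * 100, 1))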

Imagga
created on 2022-01-28

snow 32.9
winter 26.4
forest 24.4
tree 24.3
cold 23.3
child 23.1
outdoor 21.4
landscape 20.8
season 17.9
park 17.3
mechanical device 16.9
outdoors 16.8
dog 16.8
people 16.7
rural 15.9
man 14.8
country 14.1
male 13.8
countryside 13.7
fun 13.5
grass 13.4
frost 13.4
weather 13.4
walk 13.3
sky 12.8
fall 12.7
snowy 12.6
mechanism 12.6
trees 12.5
frozen 12.4
scene 12.1
wood 11.7
shovel 11.4
walking 11.4
sprinkler 11.1
kin 11.1
ice 11.1
sport 11
woods 10.5
summer 10.3
horse 10
swing 9.9
hunting dog 9.7
old 9.1
scenery 9
hound 9
autumn 8.8
covered 8.7
couple 8.7
water 8.7
day 8.6
path 8.5
father 8.4
beach 8.4
adult 8.4
field 8.4
barrow 8.3
vacation 8.2
happy 8.1
road 8.1
tool 8.1
sunset 8.1
recreation 8.1
dad 8
plaything 8
snowing 7.9
seasonal 7.9
person 7.8
hand tool 7.8
men 7.7
outside 7.7
joy 7.5
canine 7.5
brown 7.4
parent 7.4
peaceful 7.3
sun 7.2
domestic animal 7.2
lifestyle 7.2
sand 7.2
farm 7.1
mountain 7.1
portrait 7.1
kid 7.1
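
The Imagga tags look like output from Imagga's tagging endpoint. A hedged sketch against what I believe is the v2 REST API, using HTTP basic auth with a placeholder key/secret pair and a placeholder image URL:

import requests

API_KEY = "YOUR_IMAGGA_KEY"        # placeholder
API_SECRET = "YOUR_IMAGGA_SECRET"  # placeholder

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photograph.jpg"},
    auth=(API_KEY, API_SECRET),
)
response.raise_for_status()

# Each tag pairs a confidence score with a language-keyed tag name.
for tag in response.json()["result"]["tags"]:
    print(tag["tag"]["en"], round(tag["confidence"], 1))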

Google
created on 2022-01-28

Microsoft
created on 2022-01-28

tree 99.8
outdoor 99.8
text 83.9
snow 83.5
person 75.2
black and white 73.6
clothing 71
grave 62.3

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 23-31
Gender Male, 98.2%
Calm 95%
Sad 3.4%
Happy 0.6%
Confused 0.3%
Disgusted 0.3%
Angry 0.2%
Fear 0.1%
Surprised 0.1%
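
The age range, gender, and emotion scores above match what Amazon Rekognition's face detection returns when all attributes are requested. A minimal boto3 sketch, again with a hypothetical file name:

import boto3

rekognition = boto3.client("rekognition")

with open("photograph.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # needed to get age range, gender, and emotions
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")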

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
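
The Google Vision rows report likelihood buckets (very unlikely through very likely) rather than percentages, which is how the Cloud Vision face detection API expresses its results. A minimal sketch with the google-cloud-vision client, assuming application default credentials and a hypothetical file name:

from google.cloud import vision

client = vision.ImageAnnotatorClient()  # assumes application default credentials

with open("photograph.jpg", "rb") as f:  # hypothetical file name
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each attribute is a Likelihood enum such as VERY_UNLIKELY or LIKELY.
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)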

Feature analysis

Amazon

Person
Dog
Person 99.8%
Person 94.6%
Person 66.2%
Dog 91.3%
Dog 83.5%

Captions

Microsoft
created on 2022-01-28

a man riding a horse 76.3%
a man holding a dog 49.2%
a man riding a horse in a field 49.1%
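
The Microsoft captions, each with a confidence score, look like output from Azure Computer Vision's image description feature. A rough sketch against the REST API as I understand it; the endpoint, key, API version path, and image URL are all placeholders or assumptions:

import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_AZURE_KEY"                                          # placeholder

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/describe",      # v3.2 path is an assumption
    params={"maxCandidates": 3},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.com/photograph.jpg"},
)
response.raise_for_status()

# Each caption candidate has text and a 0-1 confidence.
for caption in response.json()["description"]["captions"]:
    print(caption["text"], round(caption["confidence"] * 100, 1))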

Text analysis

Google

MJI7--YT37A°2 - - AGOX
MJI7--YT37A°2
-
AGOX
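
The text analysis lines above are raw OCR output. A minimal sketch of the corresponding Cloud Vision text detection call, reusing the same client setup as the face-detection sketch earlier and the same hypothetical file name:

from google.cloud import vision

client = vision.ImageAnnotatorClient()  # assumes application default credentials

with open("photograph.jpg", "rb") as f:  # hypothetical file name
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

annotations = response.text_annotations
if annotations:
    # The first annotation is the full detected string; the rest are individual tokens.
    print(annotations[0].description)
    for token in annotations[1:]:
        print(token.description)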