Human Generated Data

Title

Untitled (three children posed with goat and dog outdoors)

Date

1961

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9836

Machine Generated Data

Tags

Amazon
created on 2022-01-28

Vegetation 99.8
Plant 99.8
Human 99.8
Person 99.8
Nature 99
Woodland 98.9
Outdoors 98.9
Land 98.9
Forest 98.9
Tree 98.9
Yard 98.3
Grove 98.1
Person 96.7
Shorts 95.7
Clothing 95.7
Apparel 95.7
Face 94.7
Dog 93.7
Animal 93.7
Pet 93.7
Mammal 93.7
Canine 93.7
Grass 93.2
Smile 91.3
Person 90.8
Shelter 87.4
Rural 87.4
Countryside 87.4
Building 87.4
Play 83.7
Ground 74.5
Park 72.6
Lawn 72.6
Kid 72.4
Child 72.4
Furniture 68.9
Female 68.4
Portrait 68.2
Photography 68.2
Photo 68.2
People 67.8
Field 64.8
Ice 63.3
Boy 58
Sand 57.3
Housing 55.9
Standing 55.2
Person 42.9

Imagga
created on 2022-01-28

swing 100
plaything 100
mechanical device 100
mechanism 87.8
snow 61.4
winter 51.1
cold 43.9
forest 41.8
tree 36.5
park 33.8
landscape 33.5
frost 30.7
trees 28.5
season 25.7
snowy 24.3
ice 22.6
weather 22.4
outdoor 22.2
branch 18.2
frozen 18.2
snowfall 17.7
scene 17.3
freeze 16.5
wood 15.9
covered 15.5
walk 15.2
seasonal 14.9
frosty 14.7
outdoors 14.5
scenery 14.4
woods 14.3
sky 13.4
light 13.4
mountain 13.4
river 12.5
rural 12.3
country 12.3
path 12.3
peaceful 11.9
fall 11.8
road 11.7
water 11.4
snowing 10.8
recreation 10.8
autumn 10.5
scenic 10.5
walking 10.4
countryside 10.1
fun 9.7
hiking 9.6
sport 9.6
people 9.5
wilderness 9.4
old 9.1
natural 8.7
day 8.6
travel 8.5
sunset 8.1
cool 8
chill 7.9
serenity 7.8
trail 7.7
fog 7.7
pine 7.7
climate 7.6
environment 7.4
man 7.4
vacation 7.4
tourist 7.3
sun 7.3
morning 7.2
holiday 7.2
activity 7.2

Google
created on 2022-01-28

Microsoft
created on 2022-01-28

outdoor 98.4
snow 92
text 87
black and white 84.7
tree 76.8
person 76.6
clothing 69.8
monochrome 50.7

Face analysis

AWS Rekognition

Age 35-43
Gender Male, 94.6%
Fear 64.9%
Calm 17.2%
Sad 12.7%
Angry 1.3%
Surprised 1.2%
Disgusted 0.9%
Confused 0.9%
Happy 0.8%

AWS Rekognition

Age 23-31
Gender Male, 95.8%
Sad 39.3%
Happy 36.2%
Calm 21.3%
Angry 0.8%
Disgusted 0.7%
Confused 0.7%
Surprised 0.5%
Fear 0.5%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.8%
Dog 93.7%

Captions

Microsoft

a person riding a horse 65.7%
a group of people riding on the back of a horse 60.4%
a person riding on the back of a horse 60.3%

Text analysis

Amazon

YT37A
M+17-
A.A.O
M+17- YT37A THE A.A.O
THE