Human Generated Data

Title

Untitled (football team posing in position on field with hills in background)

Date

1950

People

Artist: Orrion Barger, American, active 1913–1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6622

Machine Generated Data

Tags

Amazon
created on 2019-03-25

Human 97
Person 93.3
Person 92.8
Person 91.7
Person 90.4
Mammal 84.6
Animal 84.6
Person 80.2
Horse 79.4
Person 79.4
Wildlife 78.5
Deer 74.9
Person 74.5
Elk 69
Hunting 66.7
Person 53.7
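
Tag sets like the one above come from Amazon Rekognition's label-detection API. Below is a minimal Python sketch of that call using boto3; the local file name and the 50% confidence floor are assumptions, not part of the record.

    import boto3

    # Minimal sketch: ask Rekognition for scene/object labels, as in the
    # "Amazon" tag list above. The file name is a hypothetical local copy.
    client = boto3.client("rekognition")
    with open("4.2002.6622.jpg", "rb") as f:
        image_bytes = f.read()

    response = client.detect_labels(
        Image={"Bytes": image_bytes},
        MinConfidence=50,  # assumption: the listing appears to stop near 50%
    )
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")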

Clarifai
created on 2019-03-25

people 99.5
many 97.3
man 96.9
group 96.6
group together 96.2
cavalry 92.3
crowd 88.3
adult 87.9
military 87.7
war 79.1
illustration 76.8
woman 75.6
child 75.6
nature 75.3
recreation 74.4
army 73.9
mammal 73.9
dog 73.5
skirmish 73.2
family 72.5
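
The Clarifai tags above correspond to its public general-purpose model. A hedged sketch of the v2 REST predict call follows; the API key, model ID, and image URL are placeholders, and the response shape reflects the circa-2019 API.

    import requests

    API_KEY = "YOUR_CLARIFAI_API_KEY"       # placeholder
    MODEL_ID = "general-image-recognition"  # assumption: public general model
    resp = requests.post(
        f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
        headers={"Authorization": f"Key {API_KEY}"},
        json={"inputs": [{"data": {"image": {"url": "https://example.org/photo.jpg"}}}]},
    )
    # Each concept carries a name and a 0-1 confidence value.
    for concept in resp.json()["outputs"][0]["data"]["concepts"]:
        print(f"{concept['name']} {concept['value'] * 100:.1f}")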

Imagga
created on 2019-03-25

camel 56.1
grass 30.1
rural 30
farm 27.7
horse 25.5
ungulate 23.8
ranch 23.7
field 23.4
country 22.8
outdoor 21.4
animals 21.3
sky 20.6
landscape 18.6
mountain 18
horses 17.5
pasture 17.3
livestock 17.1
travel 16.9
outdoors 16.8
summer 16.7
herd 16.7
cow 15.9
cattle 15.9
mountains 15.8
wild 15.7
meadow 15.3
countryside 14.6
grazing 13.7
land 13.6
desert 13.3
sand 13
hill 12.2
steppe 12.2
dairy 12.1
hound 11.9
brown 11.8
dog 11.6
plain 11.6
agriculture 11.4
sun 11.3
bovine 11.3
slope 11.2
domestic 10.9
tourism 10.7
running 10.6
standing 10.4
ascent 10.4
clouds 10.1
people 10
sheep 9.9
equine 9.8
stallion 9.8
group 9.7
cowboy 9.6
farming 9.5
tree 9.2
sunset 9
cows 8.9
riding 8.8
scenic 8.8
sport 8.4
freedom 8.2
beach 8.2
environment 8.2
scenery 8.1
man 8.1
wilderness 7.9
hunting dog 7.9
mane 7.8
sea 7.8
men 7.7
run 7.7
male 7.7
outside 7.7
sunrise 7.5
dry 7.4
wildlife 7.1
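
Imagga exposes its tagger as a REST endpoint with HTTP Basic auth. A short sketch follows; the credentials and image URL are placeholders.

    import requests

    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.org/photo.jpg"},
        auth=("YOUR_IMAGGA_KEY", "YOUR_IMAGGA_SECRET"),  # placeholders
    )
    # Tags arrive as {"confidence": ..., "tag": {"en": ...}} objects.
    for item in resp.json()["result"]["tags"]:
        print(f"{item['tag']['en']} {item['confidence']:.1f}")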

Google
created on 2019-03-25

Microsoft
created on 2019-03-25

grass 98.9
outdoor 98.7
animal 98.7
wildlife 78.5
nature 56.5
landscape 48.9
black and white 46
fall 41
infrared 32.8
deer 31
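
The Microsoft tags come from the Azure Computer Vision "analyze" endpoint. A hedged sketch of the v2.0-era API; the region, key, and image URL are placeholders.

    import requests

    endpoint = "https://westus.api.cognitive.microsoft.com/vision/v2.0/analyze"
    resp = requests.post(
        endpoint,
        params={"visualFeatures": "Tags"},
        headers={"Ocp-Apim-Subscription-Key": "YOUR_AZURE_KEY"},  # placeholder
        json={"url": "https://example.org/photo.jpg"},
    )
    for tag in resp.json()["tags"]:
        print(f"{tag['name']} {tag['confidence'] * 100:.1f}")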

Face analysis

Amazon

AWS Rekognition

Age 20-38
Gender Female, 53.2%
Disgusted 45.5%
Confused 45.2%
Surprised 45.2%
Angry 46%
Happy 45.3%
Sad 51.6%
Calm 46.3%

AWS Rekognition

Age 20-38
Gender Female, 51.7%
Sad 53.7%
Confused 45.3%
Disgusted 45.1%
Calm 45.6%
Happy 45.1%
Angry 45.2%
Surprised 45.1%

AWS Rekognition

Age 11-18
Gender Male, 53.6%
Sad 45.4%
Disgusted 45.1%
Happy 45.1%
Calm 54%
Angry 45.2%
Confused 45.2%
Surprised 45.1%

AWS Rekognition

Age 45-63
Gender Male, 53.3%
Happy 45.1%
Angry 45.8%
Sad 46.3%
Confused 45.5%
Disgusted 45.8%
Calm 51.3%
Surprised 45.2%

AWS Rekognition

Age 35-52
Gender Female, 54%
Sad 48.4%
Happy 45.2%
Confused 45.3%
Surprised 45.2%
Disgusted 45.5%
Calm 49.9%
Angry 45.5%

AWS Rekognition

Age 23-38
Gender Female, 51.9%
Sad 45.3%
Angry 45.2%
Surprised 45.1%
Happy 45.4%
Calm 53.7%
Confused 45.1%
Disgusted 45.2%

AWS Rekognition

Age 20-38
Gender Female, 54.4%
Confused 45.1%
Angry 45.1%
Surprised 45.1%
Sad 45.4%
Happy 45.1%
Calm 54.1%
Disgusted 45.1%

AWS Rekognition

Age 35-55
Gender Female, 51.9%
Sad 47.6%
Happy 45.1%
Angry 45.2%
Disgusted 45.3%
Confused 45.1%
Calm 51.6%
Surprised 45.1%

AWS Rekognition

Age 35-55
Gender Male, 50.2%
Disgusted 48%
Confused 45.4%
Surprised 45.5%
Angry 46.1%
Happy 46.2%
Sad 48%
Calm 45.8%

AWS Rekognition

Age 26-43
Gender Female, 50.9%
Happy 45.1%
Calm 52.7%
Sad 46.8%
Angry 45.2%
Confused 45.1%
Surprised 45.1%
Disgusted 45%
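
Each block above is one face returned by Rekognition's face-attribute call. A minimal boto3 sketch is below; the file name is hypothetical.

    import boto3

    client = boto3.client("rekognition")
    with open("4.2002.6622.jpg", "rb") as f:
        image_bytes = f.read()

    # Attributes=["ALL"] adds the age range, gender, and emotion scores.
    response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
        for emotion in face["Emotions"]:
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")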

Feature analysis

Amazon

Person 93.3%
Horse 79.4%

Text analysis

Amazon

YT3A8
XAO
8
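
The strings above are Rekognition's OCR output, likely fragments of jersey lettering. A sketch of the call follows; the file name is hypothetical.

    import boto3

    client = boto3.client("rekognition")
    with open("4.2002.6622.jpg", "rb") as f:
        image_bytes = f.read()

    response = client.detect_text(Image={"Bytes": image_bytes})
    # Keep only full LINE detections; WORD entries repeat the same text.
    for det in response["TextDetections"]:
        if det["Type"] == "LINE":
            print(det["DetectedText"])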