Human Generated Data

Title

Untitled (fair grounds, man in Native American headdress driving pony cart with children)

Date

1959

People

Artist: Francis J. Sullivan, American, 1916–1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18724

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Grass 97.2
Plant 97.2
Human 96.9
Person 96.9
Animal 95.5
Mammal 95.5
Horse 95.5
Person 93
Person 93
Person 90.6
Chair 88.5
Furniture 88.5
Person 88.1
Outdoors 85.9
Horse 84.9
Person 84.2
Nature 83.1
Yard 81.1
People 78.8
Shorts 78.5
Apparel 78.5
Clothing 78.5
Person 77.8
Person 77.1
Person 76.9
Musician 74.7
Musical Instrument 74.7
Female 71.2
Person 70.7
Meal 68.4
Food 68.4
Field 67.7
Horse 64.5
Person 63.1
Kid 58.8
Girl 58.8
Blonde 58.8
Child 58.8
Teen 58.8
Woman 58.8
Crowd 58.4
Leisure Activities 56.8
Music Band 56.3
Person 53.1
Person 50.5
Person 42.5
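Repeated labels in the Amazon list above (for example the many "Person" and "Horse" entries) are separate detected instances, each with its own confidence score. A minimal Python sketch, using a small excerpt of the scores listed above, collapses them to one score per label by keeping the maximum:

```python
# Excerpt of the Amazon Rekognition labels above: (label, confidence %).
detections = [
    ("Person", 96.9), ("Person", 93.0), ("Person", 90.6),
    ("Horse", 95.5), ("Horse", 84.9), ("Horse", 64.5),
    ("Grass", 97.2),
]

best: dict[str, float] = {}
for label, conf in detections:
    # Keep only the highest-confidence instance per label.
    best[label] = max(best.get(label, 0.0), conf)

print(best)  # {'Person': 96.9, 'Horse': 95.5, 'Grass': 97.2}
```

This mirrors how the "Feature analysis" section below reports a single score per class (Person 96.9%, Horse 95.5%): the top instance stands in for the group.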

Imagga
created on 2022-03-05

plow 35.8
farm 32.1
rural 30
horse 27.4
tool 26
grass 24.5
field 20.9
ranch 20.8
cattle 20.7
landscape 19.3
negative 18.3
fence 18
pasture 17.2
cow 16.5
horses 15.6
sky 15.4
livestock 14.9
film 14.5
dairy 14.2
country 14
agriculture 14
countryside 13.7
equine 13.7
outdoors 13.4
meadow 12.6
brown 12.5
outdoor 12.2
resort 11.9
bovine 11.7
tree 11.5
hay 11.5
snow 11.4
photographic paper 11.2
winter 11.1
summer 10.9
deer 10.8
farming 10.4
ox 10.4
animals 10.2
animal 10.2
scenery 9.9
mare 9.8
herd 9.8
cart 9.7
wild 9.6
mammal 9.4
ride 9.4
sand 9.4
trees 8.9
cows 8.9
sun 8.9
grazing 8.8
house 8.4
carriage 8.1
scenic 7.9
stable 7.9
stallion 7.8
mane 7.8
photographic equipment 7.4
environment 7.4
wagon 7.3
domestic 7.2
black 7.2
wildlife 7.1
mountain 7.1
day 7.1
travel 7

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

grass 99.8
horse 96.9
outdoor 95.1
text 90.2
old 83
black 69.8
white 63.1
mammal 53.3
vintage 26.2

Face analysis

Amazon

AWS Rekognition

Age 27-37
Gender Male, 96.2%
Disgusted 32.7%
Sad 25.1%
Calm 14.8%
Angry 10.6%
Fear 7.3%
Happy 4%
Surprised 3.9%
Confused 1.6%

AWS Rekognition

Age 2-10
Gender Male, 99.7%
Calm 77.1%
Confused 12.1%
Surprised 7.1%
Happy 1.3%
Disgusted 1%
Angry 0.9%
Sad 0.5%
Fear 0.2%

AWS Rekognition

Age 23-33
Gender Female, 53.4%
Sad 60.7%
Disgusted 12.9%
Confused 9%
Calm 6.3%
Surprised 4.8%
Angry 4.3%
Happy 1.2%
Fear 0.9%

AWS Rekognition

Age 25-35
Gender Female, 99.2%
Disgusted 64.9%
Calm 21.5%
Happy 4.6%
Sad 4.2%
Fear 2.1%
Angry 1.3%
Confused 0.9%
Surprised 0.5%

AWS Rekognition

Age 21-29
Gender Female, 94.1%
Sad 63.5%
Calm 20.7%
Confused 8.6%
Fear 2.5%
Disgusted 1.7%
Angry 1.4%
Happy 1.1%
Surprised 0.4%
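Each Rekognition face record above reports a full emotion distribution rather than a single label. A hedged sketch of how one might reduce a distribution to its dominant emotion, using the first face's scores from the listing:

```python
# Emotion scores (%) for the first face in the Rekognition results above.
emotions = {
    "Disgusted": 32.7, "Sad": 25.1, "Calm": 14.8, "Angry": 10.6,
    "Fear": 7.3, "Happy": 4.0, "Surprised": 3.9, "Confused": 1.6,
}

# The dominant emotion is the argmax over the distribution.
dominant = max(emotions, key=emotions.get)
print(dominant)  # Disgusted
```

Note that the scores are spread fairly evenly here (the top emotion holds only 32.7%), so the dominant label alone understates the model's uncertainty.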

Feature analysis

Amazon

Person 96.9%
Horse 95.5%

Captions

Microsoft

a vintage photo of a group of people standing next to a horse 91.7%
a vintage photo of a horse 91.6%
a vintage photo of a group of people standing on top of a horse 89.4%

Text analysis

Amazon

E
MJI7
E MJI7 YT37AS
YT37AS
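The Amazon text detections above mix full lines ("E MJI7 YT37AS") with their word-level fragments ("E", "MJI7", "YT37AS"). A small sketch, assuming exactly the four strings listed, keeps only detections that are not contained in a longer one:

```python
# The four Rekognition text detections listed above.
detections = ["E", "MJI7", "E MJI7 YT37AS", "YT37AS"]

# Drop any detection that appears inside a longer detection
# (i.e., discard word-level fragments of an already-detected line).
lines = [t for t in detections
         if not any(t != other and t in other for other in detections)]
print(lines)  # ['E MJI7 YT37AS']
```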