Human Generated Data

Title

Untitled (ten people posed on horseback with one man crouching in field)

Date

1950

People

Artist: Orrion Barger, American, active 1913 - 1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6260

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Human 98.3
People 97.1
Field 95.5
Horse 95
Animal 95
Mammal 95
Sport 93.1
Sports 93.1
Horse 90.8
Horse 90.7
Person 90.1
Horse 87.3
Shorts 85.2
Clothing 85.2
Apparel 85.2
Horse 84.2
Person 84
Team Sport 83
Team 83
Person 81.5
Horse 78.6
Football 76.4
Person 75.8
Person 73.8
Person 73.1
Person 67.6
Person 66.1
Grass 64
Plant 64
Cricket 59.8
Stadium 59.2
Building 59.2
Arena 59.2
Croquet 57.5
Person 51.5
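
The label-and-confidence pairs above match the shape of Amazon Rekognition DetectLabels output. Below is a minimal sketch of how such tags could be generated with boto3; the local filename, region, and the MaxLabels/MinConfidence thresholds are assumptions for illustration, not values documented in this record.

```python
import boto3

# Rekognition client; the region is an assumption for this sketch.
client = boto3.client("rekognition", region_name="us-east-1")

# Hypothetical local copy of the photograph.
with open("untitled_horseback.jpg", "rb") as f:
    image_bytes = f.read()

# Request labels; these thresholds are illustrative, not the museum's settings.
response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,
    MinConfidence=50,
)

# Print "Label confidence" pairs in the same style as the list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```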

Clarifai
created on 2023-10-26

people 99.8
adult 98
man 97.9
group 97.9
many 97
wear 96.2
group together 95.5
art 91.3
mammal 90.5
canine 80.7
woman 76.6
cavalry 73.6
outfit 72.7
dog 72.3
illustration 71.8
war 70.9
uniform 70.6
dancer 69.5
education 68.3
child 66.5

Imagga
created on 2022-01-22

sky 23.8
landscape 21.6
farm 19.6
rural 17.6
travel 16.9
outdoor 16.8
cattle 16.5
sea 16.4
water 16
cow 15.9
grass 15.8
beach 15.7
group 15.3
plastic bag 14.8
summer 14.8
sand 14.7
tourism 14
herd 13.7
outdoors 13.7
coast 13.5
shore 13.1
vacation 13.1
animals 13
bag 12.6
livestock 12.5
ocean 12.4
country 12.3
agriculture 12.3
spectator 12.2
natural 12
mountain 11.6
man 11.4
people 11.2
outside 11.1
countryside 11
scenic 10.5
sun 10.5
field 10
ranch 10
sunset 9.9
park 9.9
swimming trunks 9.6
dairy 9.6
seascape 9.6
person 9.5
male 9.3
lake 9.2
silhouette 9.1
bovine 9
afghan hound 9
meadow 9
swimsuit 8.9
sheep 8.9
container 8.7
standing 8.7
sunny 8.6
farming 8.5
garment 8.5
coastline 8.5
clouds 8.4
scenery 8.1
clothing 8.1
cows 7.9
holiday 7.9
flock 7.9
resort 7.8
rock 7.8
men 7.7
hound 7.6
leisure 7.5
evening 7.5
waves 7.4
environment 7.4
spring 7.1

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

outdoor 99
text 94
white 90.2
person 89.4
black 88.4
old 76.5
black and white 75.5
mammal 62.9
man 50.8
horse 21.7

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 19-27
Gender Female, 95.8%
Fear 93.9%
Surprised 2.5%
Calm 2.2%
Sad 0.4%
Happy 0.3%
Confused 0.3%
Disgusted 0.3%
Angry 0.2%

AWS Rekognition

Age 33-41
Gender Male, 96.7%
Sad 57.7%
Calm 33.4%
Confused 5.3%
Angry 1.7%
Happy 0.6%
Surprised 0.5%
Disgusted 0.4%
Fear 0.3%
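
The age range, gender, and emotion scores in the two blocks above follow the structure of Amazon Rekognition DetectFaces results. A minimal sketch, under the same assumptions as the earlier example (hypothetical filename and region):

```python
import boto3

client = boto3.client("rekognition", region_name="us-east-1")  # region is an assumption

with open("untitled_horseback.jpg", "rb") as f:  # hypothetical local filename
    image_bytes = f.read()

# Attributes=["ALL"] is required to get age range and emotions back.
response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions are reported per face; list them highest confidence first.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```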

Feature analysis

Amazon

Horse 95%
Person 90.1%

Text analysis

Amazon

VIEEA2
هولا
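
The two detected strings above ("VIEEA2" and the Arabic-script "هولا") are the kind of output Amazon Rekognition DetectText returns for signage or handwriting in a photograph. The sketch below, under the same assumptions as the earlier examples, prints each detected line with its confidence.

```python
import boto3

client = boto3.client("rekognition", region_name="us-east-1")  # region is an assumption

with open("untitled_horseback.jpg", "rb") as f:  # hypothetical local filename
    image_bytes = f.read()

response = client.detect_text(Image={"Bytes": image_bytes})

# Rekognition returns both LINE and WORD detections; keep the lines only.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], f"{detection['Confidence']:.1f}%")
```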