Human Generated Data

Title

Untitled (group of men with two horses at Quarterhorse Show)

Date

1947

People

Artist: Harry Annas, American 1897 - 1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.2680

Machine Generated Data

Tags

Amazon
created on 2022-01-16

Person 99.7
Human 99.7
Person 99.6
Person 99.6
Person 99.5
Person 99.4
Horse 97.8
Animal 97.8
Mammal 97.8
Person 97.7
Person 97.4
Person 96.3
Clothing 95.5
Apparel 95.5
Shorts 95.1
Person 94
People 85.7
Person 82.4
Suit 71.3
Coat 71.3
Overcoat 71.3
Female 58.1

Clarifai
created on 2023-10-26

people 99.8
many 97.9
group together 97.6
group 95.8
child 94.5
wear 94
adult 93.8
cavalry 93.8
uniform 92
man 91.2
crowd 87.9
boy 83.7
audience 82.4
outfit 82.3
mammal 81.9
administration 77.2
athlete 76.8
spectator 71.8
woman 70.7
music 70.6

Imagga
created on 2022-01-16

dairy 100
farm 40.2
rural 39.7
cow 38.9
cattle 34.8
ranch 32.9
field 28.5
grass 27.7
landscape 27.5
livestock 26
country 23.7
herd 20.6
agriculture 19.3
pasture 19.2
sky 19.1
meadow 18.8
animals 18.5
farming 18
cows 16.8
bovine 16.1
countryside 15.5
grazing 14.7
summer 14.2
brown 14
fence 13.9
scene 13.9
tree 13.8
outdoors 13.4
horse 13.4
outdoor 13
group 12.9
land 12.4
milk 12.4
scenic 12.3
black 12
horses 11.7
mountain 11.6
environment 11.5
pen 10.9
graze 10.8
scenery 10.8
sun 10.5
hay 10
trees 9.8
farmland 9.7
fields 9.7
enclosure 9.1
equine 9.1
horizon 9
sheep 9
mare 8.8
beef 8.6
travel 8.5
water 8
bull 7.9
wild 7.8
sunny 7.7
outside 7.7
head 7.6
mountains 7.4
structure 7.2
meat 7.2
sunset 7.2

Google
created on 2022-01-16

Microsoft
created on 2022-01-16

text 99.5
horse 86.8
standing 86.2
white 77.7
old 75.8
black 72.7
posing 45.4

Face analysis

AWS Rekognition

Age 31-41
Gender Male, 87.7%
Happy 66%
Sad 27.9%
Calm 1.5%
Fear 1.3%
Angry 0.9%
Disgusted 0.9%
Confused 0.8%
Surprised 0.6%

AWS Rekognition

Age 45-53
Gender Male, 99.7%
Calm 95.7%
Confused 1.7%
Happy 1.2%
Sad 0.9%
Surprised 0.2%
Disgusted 0.2%
Angry 0.2%
Fear 0.1%

AWS Rekognition

Age 33-41
Gender Male, 99.8%
Calm 91.5%
Sad 5.4%
Happy 1.9%
Confused 0.4%
Surprised 0.3%
Angry 0.3%
Disgusted 0.2%
Fear 0.1%

AWS Rekognition

Age 30-40
Gender Female, 72%
Calm 99%
Sad 0.3%
Angry 0.2%
Confused 0.2%
Surprised 0.1%
Fear 0.1%
Happy 0.1%
Disgusted 0.1%

AWS Rekognition

Age 22-30
Gender Female, 75.6%
Calm 71.9%
Happy 23.1%
Sad 1.3%
Disgusted 1.2%
Surprised 0.8%
Angry 0.7%
Confused 0.6%
Fear 0.3%

AWS Rekognition

Age 18-24
Gender Male, 83.9%
Calm 67.4%
Sad 10.1%
Angry 9.6%
Fear 4.2%
Confused 3.7%
Disgusted 2.5%
Surprised 1.4%
Happy 1%

AWS Rekognition

Age 26-36
Gender Male, 97.3%
Sad 66.6%
Calm 12.7%
Happy 6.6%
Disgusted 4.1%
Confused 3.9%
Angry 2.7%
Surprised 2.3%
Fear 1.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%
Horse 97.8%

Text analysis

Amazon

J33
KODAK-SVEELA