Human Generated Data

Title

Untitled (two men holding turkeys)

Date

c. 1950

People

Artist: Harry Annas, American, 1897 - 1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.2627

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Person 99.7
Human 99.7
Person 96.3
Helmet 65.3
Clothing 65.3
Apparel 65.3
Animal 64.6
Face 64.3
Shorts 63.8
Outdoors 61
Mammal 59.7
Rodeo 59.3
Bullfighter 56.8
Hunting 56.7
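
The Amazon tags above are label/confidence pairs of the kind returned by AWS Rekognition's DetectLabels operation. As a minimal sketch only (the region, credential setup, and the local file name photo.jpg are assumptions, not part of this record), labels in this shape could be produced like so:

```python
import boto3

# Sketch: request label/confidence tags for a local image from
# AWS Rekognition DetectLabels. Region and file name are assumptions.
client = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=20,        # cap the number of labels returned
        MinConfidence=50.0,  # drop labels below 50% confidence
    )

# Print "Name Confidence" pairs, mirroring the list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```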

Clarifai
created on 2023-10-26

people 99.9
group together 98.3
group 98.1
adult 96.3
veil 95
wear 94.4
man 94.3
recreation 93.2
three 91
vehicle 90.1
child 88.9
outfit 88.3
two 87.8
four 84.1
boy 84
canine 83.9
uniform 83
music 82.1
monochrome 78.5
cavalry 77.8

Imagga
created on 2022-01-15

rope 36.5
line 22.3
shield 21.5
armor 18.8
man 18.1
travel 17.6
sand 17.4
sky 16.7
outdoors 15.7
beach 15.7
percussion instrument 15.3
person 14.5
water 14
desert 14
summer 13.5
protective covering 13.4
people 13.4
child 13.4
sport 13.2
vacation 13.1
musical instrument 12.9
drum 12.7
holiday 12.2
male 12.1
building 12
adult 11.7
sea 11.7
tourism 11.5
outdoor 11.5
camel 11.1
two 11
tourist 10.6
statue 10.5
hat 10.2
old 9.7
sun 9.7
covering 9.2
ocean 9.1
protection 9.1
horse 8.8
cowboy 8.7
sunny 8.6
clouds 8.4
dry 8.4
fun 8.2
landscape 8.2
dirty 8.1
history 8
seller 8
ride 8
culture 7.7
dirt 7.6
traditional 7.5
city 7.5
animal 7.4
fountain 7.4
competition 7.3
playing 7.3
danger 7.3
coast 7.2
day 7.1
architecture 7

Microsoft
created on 2022-01-15

text 98.6
outdoor 97.7
black and white 62.3
posing 56.7

Face analysis

AWS Rekognition

Age 54-64
Gender Male, 99.6%
Calm 95.3%
Happy 2.2%
Fear 0.6%
Sad 0.5%
Disgusted 0.5%
Surprised 0.4%
Angry 0.3%
Confused 0.2%

AWS Rekognition

Age 48-54
Gender Male, 97.8%
Calm 98.3%
Confused 0.5%
Sad 0.3%
Happy 0.2%
Surprised 0.2%
Angry 0.2%
Fear 0.1%
Disgusted 0.1%
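
The two AWS Rekognition entries above (an age range, a gender estimate, and emotion percentages, one block per detected face) match the shape of the DetectFaces response. A minimal sketch, again assuming boto3 credentials and a hypothetical local photo.jpg:

```python
import boto3

# Sketch: per-face age range, gender, and emotion scores from
# AWS Rekognition DetectFaces. Attributes=["ALL"] requests the full set.
client = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    # Emotions come back unsorted; list the dominant emotion first.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```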

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
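
The three Google Vision blocks above report per-face likelihood buckets (Very unlikely through Very likely) rather than percentages, which is the form of the face_annotations returned by Cloud Vision face detection. A minimal sketch, assuming the google-cloud-vision client library and a hypothetical local photo.jpg:

```python
from google.cloud import vision


def bucket(value: int) -> str:
    # Convert the numeric likelihood enum to wording like "Very unlikely".
    return vision.Likelihood(value).name.replace("_", " ").capitalize()


# Sketch: joy/sorrow/anger/surprise plus headwear and blur likelihoods
# for each face detected by Google Cloud Vision.
client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    print("Surprise", bucket(face.surprise_likelihood))
    print("Anger", bucket(face.anger_likelihood))
    print("Sorrow", bucket(face.sorrow_likelihood))
    print("Joy", bucket(face.joy_likelihood))
    print("Headwear", bucket(face.headwear_likelihood))
    print("Blurred", bucket(face.blurred_likelihood))
```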

Feature analysis

Amazon

Person 99.7%
Helmet 65.3%

Categories

Imagga

paintings art 99.5%

Text analysis

Google

YT3RA2-A