Human Generated Data

Title

Untitled (four men playing golf)

Date

1947

People

Artist: John Howell, American, active 1930s-1960s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.21714

Machine Generated Data

Tags

Amazon
created on 2022-03-11

Person 99.8
Human 99.8
Person 99.7
Person 99.6
Sport 96.1
Sports 96.1
Golf 92.4
Golf Club 90.5
Clothing 76.5
Apparel 76.5
Person 74.9
Putter 65.7
Hat 60.1
Person 55.8

Clarifai
created on 2023-10-22

people 99.9
golfer 99.7
group together 99.2
golf club 98.9
adult 97.2
golf 97.0
group 96.6
man 96.4
three 95.5
sports equipment 95.4
four 95.4
two 93.3
several 93.2
recreation 89.8
wear 88.2
leader 88.2
five 86.4
many 84.1
woman 79.9
competition 77.0

Imagga
created on 2022-03-11

crutch 100
staff 100
stick 100
man 25.5
people 21.2
sport 16.5
person 15.9
active 15.3
outdoors 14.9
adult 13.6
walking 13.3
leisure 12.4
senior 12.2
travel 12
old 11.8
male 11.3
outdoor 10.7
happy 10.6
sky 10.2
couple 9.6
walk 9.5
men 9.4
work 9.4
beach 9.3
activity 9
grass 8.7
sunny 8.6
two 8.5
joy 8.3
vacation 8.2
landscape 8.2
lifestyle 7.9
sand 7.9
day 7.8
summer 7.7
road 7.2
working 7.1
happiness 7

Google
created on 2022-03-11

Microsoft
created on 2022-03-11

outdoor 99
golf 96.9
text 95.2
person 94.4
black and white 81.2
player 75.8
white 64.5
monochrome 53.5
image 33.1

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 53-61
Gender Male, 97.9%
Sad 96.9%
Calm 2.2%
Happy 0.5%
Confused 0.2%
Angry 0.1%
Disgusted 0.1%
Fear 0.1%
Surprised 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Person 99.8%
Person 99.7%
Person 99.6%
Person 74.9%
Person 55.8%

Categories

Text analysis

Amazon

VI33A2
٢مع