Human Generated Data

Title

Untitled (golfers with their clubs)

Date

c. 1950

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15618

Machine Generated Data

Tags

Amazon
created on 2022-03-25

Person 99.8
Human 99.8
Transportation 99.4
Bike 99.4
Vehicle 99.4
Bicycle 99.4
Person 99
Person 98.9
Person 98.8
Person 95.1
Clothing 94.9
Apparel 94.9
Sport 94.7
Sports 94.7
Person 94.7
Shoe 92.3
Footwear 92.3
Shoe 89.4
Shorts 88.8
Golf 85
Golf Club 81
Tartan 56.7
Plaid 56.7
Shoe 54

Imagga
created on 2022-03-25

crutch 100
staff 100
stick 98.2
man 28.9
people 23.4
sport 21.4
male 21.3
leisure 19.1
walking 18
adult 16.2
person 16.1
outdoor 15.3
happy 15
one 14.9
active 14.4
recreation 14.3
outdoors 14.2
men 13.7
grass 13.4
golf 13.4
lifestyle 13
vacation 12.3
senior 12.2
standing 11.3
old 11.2
golfer 10.8
course 10.5
fun 10.5
couple 10.5
play 10.3
cold 10.3
outside 10.3
sky 10.2
sports 10.2
playing 10
activity 9.9
attractive 9.8
mountain 9.8
golfing 9.8
landscape 9.7
club 9.4
day 9.4
chair 9.3
smile 9.3
summer 9
snow 8.9
game 8.9
looking 8.8
women 8.7
smiling 8.7
retirement 8.6
winter 8.5
player 8.5
travel 8.5
professional 8.4
health 8.3
alone 8.2
job 8
business 7.9
ball 7.9
work 7.8
pretty 7.7
hiking 7.7
walk 7.6
hobby 7.6
silhouette 7.5
success 7.2
suit 7.2
to 7.1

Google
created on 2022-03-25

Golfer 91.3
Product 90.7
Black 89.6
Golf equipment 88.8
Shorts 88.2
Tree 88.2
Standing 86.4
Black-and-white 86.4
Style 84
Golf club 83.2
Wheel 80.5
Tire 78.9
Golf 78.6
Recreation 77.6
Monochrome 76
Iron 75.8
Monochrome photography 75.6
Vintage clothing 73.8
Rolling 72.2
Grass 71.2

Microsoft
created on 2022-03-25

outdoor 97.9
person 90.9
text 88.2
golf 80.3
footwear 77.6
black and white 60.6
posing 38

Face analysis

AWS Rekognition

Age 60-70
Gender Female, 99.9%
Happy 63.5%
Confused 8.3%
Sad 7.7%
Disgusted 7.3%
Calm 6.2%
Fear 3.8%
Angry 2.2%
Surprised 1%

AWS Rekognition

Age 43-51
Gender Female, 99.9%
Disgusted 44.8%
Happy 37.5%
Sad 8.6%
Confused 3.9%
Angry 1.9%
Calm 1.5%
Surprised 1%
Fear 0.8%

AWS Rekognition

Age 26-36
Gender Male, 99.8%
Disgusted 53.8%
Sad 23.6%
Calm 7.5%
Confused 6.5%
Angry 4.1%
Happy 2%
Fear 1.3%
Surprised 1.2%

AWS Rekognition

Age 14-22
Gender Male, 99.7%
Calm 95.4%
Surprised 1.9%
Confused 1.2%
Disgusted 0.8%
Angry 0.2%
Sad 0.2%
Fear 0.2%
Happy 0.1%

AWS Rekognition

Age 38-46
Gender Male, 99.6%
Calm 56.6%
Confused 22.6%
Disgusted 10.1%
Angry 5.1%
Sad 3.5%
Surprised 1.2%
Happy 0.7%
Fear 0.4%

Microsoft Cognitive Services

Age 45
Gender Female

Microsoft Cognitive Services

Age 59
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.8%
Bicycle 99.4%
Shoe 92.3%

Captions

Microsoft

a group of people posing for a photo 88.1%
a group of people posing for the camera 88%
a group of people posing for a picture 87.9%

Text analysis

Amazon

MAGOM

Google

VODYM 2NLEIR EITN *2 ETAGOA
VODYM
2NLEIR
EITN
ETAGOA
*2