Human Generated Data

Title

Untitled (family on tennis court)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17047

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Shorts 99.9
Clothing 99.9
Apparel 99.9
Person 99.8
Human 99.8
Person 99.7
Person 99.7
Person 99.5
Person 98.9
Racket 93.3
Tennis Racket 93.3
Tie 91.4
Accessory 91.4
Accessories 91.4
Person 84.3
People 77.5
Tennis Racket 76.9
Outdoors 75.5
Shoe 74
Footwear 74
Shoe 71.3
Face 63.1
Hat 62.3
Nature 58.9
Cap 56.4
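
The label/score pairs above match the output format of Amazon Rekognition's DetectLabels operation. As a rough sketch only (not part of the record): the file name and the MinConfidence threshold below are assumptions, chosen because the list runs down to scores in the mid-50s.

```python
# Minimal sketch: confidence-scored labels via AWS Rekognition DetectLabels.
# Assumes boto3 is installed and AWS credentials are configured;
# "photo.jpg" is a placeholder for the digitized print.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=55,  # assumed cutoff; the record lists labels down to ~56
    )

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```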

Imagga
created on 2022-02-26

musical instrument 83.7
percussion instrument 51.2
accordion 39.7
wind instrument 33.6
marimba 32.7
keyboard instrument 31
man 28.2
people 24.5
vibraphone 22.9
male 21.3
silhouette 20.7
adult 18.8
shopping cart 18.8
person 17.4
lifestyle 17.3
sax 16.9
device 15.5
handcart 15
men 13.7
wheeled vehicle 13.2
smiling 13
day 12.5
friends 12.2
outdoors 11.9
women 11.9
casual 11.9
leisure 11.6
business 11.5
sky 10.8
sunset 10.8
happy 10.6
group 10.5
black 10.2
happiness 10.2
sport 10.2
water 10
outdoor 9.9
fun 9.7
summer 9.6
couple 9.6
building 9.5
sitting 9.4
two 9.3
active 9.3
relaxation 9.2
city 9.1
together 8.8
boy 8.7
container 8.4
portrait 8.4
teen 8.3
holding 8.3
park 8.2
cheerful 8.1
concertina 8
job 8
full length 7.8
outside 7.7
youth 7.7
old 7.7
friendship 7.5
tourism 7.4
musician 7.3
skateboard 7.2
recreation 7.2
smile 7.1
businessman 7.1
travel 7
architecture 7
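
Tag/confidence lists like the one above can be reproduced with Imagga's v2 tagging endpoint. A minimal sketch, assuming valid API credentials (the key, secret, and file name below are placeholders, not values from this record):

```python
# Minimal sketch: image tagging via the Imagga v2 REST API.
# Credentials and file name are placeholders.
import requests

resp = requests.post(
    "https://api.imagga.com/v2/tags",
    auth=("IMAGGA_API_KEY", "IMAGGA_API_SECRET"),
    files={"image": open("photo.jpg", "rb")},
)
resp.raise_for_status()

for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```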

Google
created on 2022-02-26

(no tags recorded)

Microsoft
created on 2022-02-26

text 99.1
tennis 85.8
man 81.7
clothing 72.8
footwear 57.5
black and white 54.7

Face analysis

AWS Rekognition

Age 31-41
Gender Female, 52.7%
Happy 52.4%
Calm 45.4%
Sad 0.9%
Surprised 0.5%
Confused 0.4%
Disgusted 0.3%
Fear 0.1%
Angry 0.1%

AWS Rekognition

Age 26-36
Gender Female, 90.5%
Calm 95.9%
Sad 3.4%
Happy 0.2%
Disgusted 0.1%
Angry 0.1%
Fear 0.1%
Confused 0.1%
Surprised 0.1%

AWS Rekognition

Age 33-41
Gender Male, 96.9%
Calm 63.1%
Happy 34.3%
Confused 0.9%
Sad 0.7%
Surprised 0.4%
Fear 0.3%
Disgusted 0.2%
Angry 0.1%

AWS Rekognition

Age 47-53
Gender Male, 97.6%
Calm 99.3%
Sad 0.4%
Happy 0.1%
Confused 0%
Disgusted 0%
Surprised 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 40-48
Gender Male, 98.6%
Happy 69.4%
Sad 14.3%
Calm 10.2%
Confused 4.3%
Surprised 0.7%
Disgusted 0.5%
Angry 0.4%
Fear 0.3%
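
The per-face age ranges, gender calls, and emotion percentages above correspond to Amazon Rekognition's DetectFaces operation with full attributes enabled. A minimal sketch, with the file name as a placeholder:

```python
# Minimal sketch: age range, gender, and emotion scores per detected face
# via AWS Rekognition DetectFaces. "photo.jpg" is a placeholder.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # required to get age, gender, and emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```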

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
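
The "Very unlikely"/"Unlikely" buckets above are the likelihood enum that Google Cloud Vision's face detection returns instead of numeric scores. A minimal sketch, assuming the google-cloud-vision client library and application credentials are set up (the file name is a placeholder):

```python
# Minimal sketch: per-face likelihood buckets via Google Cloud Vision.
# Assumes GOOGLE_APPLICATION_CREDENTIALS is set; "photo.jpg" is a placeholder.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    # Likelihood is an enum: VERY_UNLIKELY, UNLIKELY, ..., VERY_LIKELY
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```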

Feature analysis

Amazon

Person 99.8%
Tie 91.4%
Shoe 74%

Captions

Microsoft

a group of people posing for a picture 84.6%
a group of people posing for a photo 78.2%
a group of people posing for the camera 78.1%
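
Ranked caption candidates like these can be produced with the Azure Computer Vision "describe image" operation. A minimal sketch, assuming the azure-cognitiveservices-vision-computervision SDK; the endpoint and key are placeholders:

```python
# Minimal sketch: ranked caption candidates via Azure Computer Vision.
# Endpoint, key, and file name are placeholders.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("<subscription-key>"),
)

with open("photo.jpg", "rb") as f:
    result = client.describe_image_in_stream(f, max_candidates=3)

for caption in result.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")
```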

Text analysis

Amazon

3
2

Google

MJI7--YT 37A°2--XAC
MJI7--YT
37A°2--XAC
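
Text fragments like the Amazon results above (likely court or negative markings) match the output of AWS Rekognition's DetectText operation. A minimal sketch, with the file name as a placeholder:

```python
# Minimal sketch: OCR via AWS Rekognition DetectText.
# "photo.jpg" is a placeholder for the digitized print.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":  # skip word-level duplicates
        print(detection["DetectedText"])
```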