Human Generated Data

Title

Untitled (family holding sporting equipment by lakeshore)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17077

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 99.5
Human 99.5
Person 99.4
Person 99.2
Person 99
Person 98.9
Dog 94.7
Mammal 94.7
Pet 94.7
Animal 94.7
Canine 94.7
Stage 90.9
Leisure Activities 90.6
Clothing 85.5
Apparel 85.5
Shorts 82.4
Dance Pose 78.5
Performer 72.8
Racket 72.4
Tennis Racket 72.4
Grass 67.5
Plant 67.5
Musician 66.8
Musical Instrument 66.8
Portrait 64.7
Photography 64.7
Photo 64.7
Face 64.7
People 61.5
Sport 55.5
Sports 55.5
Cricket 55.5
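
The label list above pairs each tag with a confidence score (0–100). A minimal sketch of how such output might be post-processed — the `(label, score)` pairs are copied from the Amazon Rekognition listing above, while the function name and the 90-point threshold are illustrative choices, not part of this record:

```python
# (label, score) pairs copied from the Amazon Rekognition tags above
# (abbreviated; scores are percent confidence).
labels = [
    ("Person", 99.5), ("Human", 99.5), ("Person", 99.4), ("Person", 99.2),
    ("Person", 99.0), ("Person", 98.9), ("Dog", 94.7), ("Mammal", 94.7),
    ("Pet", 94.7), ("Animal", 94.7), ("Canine", 94.7), ("Stage", 90.9),
    ("Leisure Activities", 90.6), ("Clothing", 85.5), ("Racket", 72.4),
]

def confident_labels(pairs, threshold=90.0):
    """Return distinct label names at or above the confidence threshold,
    preserving the order in which they first appear."""
    seen, result = set(), []
    for name, score in pairs:
        if score >= threshold and name not in seen:
            seen.add(name)
            result.append(name)
    return result
```

With the default threshold this keeps the high-confidence tags (Person, Dog, Stage, …) and drops speculative ones such as Racket (72.4) and Cricket (55.5).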

Imagga
created on 2022-02-26

ballplayer 78.1
player 73
athlete 69.4
contestant 51.6
brass 48.7
bugle 38.8
wind instrument 37.3
person 36.6
sport 31.7
active 28.9
sky 27.4
people 26.2
man 25.5
fun 25.5
musical instrument 25
male 23.4
grass 22.9
outdoor 22.2
lifestyle 21.7
adult 21.6
happy 20.7
summer 19.9
exercise 18.2
joy 17.5
play 17.2
fitness 17.2
freedom 16.5
happiness 16.5
leisure 15.8
action 15.8
outdoors 15.7
golfer 15.5
boy 14.8
youth 14.5
sword 14
sunset 13.5
jump 13.4
playing 12.8
healthy 12.6
field 12.6
silhouette 12.4
ball 12.2
teenager 11.9
health 11.8
beach 11.8
spring 11.8
recreation 11.7
activity 11.6
wicket 11.5
weapon 11.5
cricket equipment 11.4
cornet 11.3
sun 11.3
body 11.2
outside 11.1
competition 11
day 11
child 11
energy 10.9
human 10.5
sports equipment 10.3
fly 10.3
clouds 10.1
vacation 9.8
game 9.8
family 9.8
device 9.7
sunny 9.5
relax 9.3
sports 9.2
children 9.1
meadow 9
together 8.8
sand 8.7
run 8.7
running 8.6
motion 8.6
vitality 8.5
club 8.5
casual 8.5
holiday 7.9
couple 7.8
men 7.7
attractive 7.7
equipment 7.7
athletic 7.7
golf 7.6
hand 7.6
relaxation 7.5
movement 7.5
free 7.5
ocean 7.5
park 7.4
guy 7.4
girls 7.3
black 7.2
love 7.1
sea 7

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

text 96.8
outdoor 91.5
person 81.2
man 69.7
old 64.6
white 61.3
black and white 55.2
posing 50.3
vintage 30.2

Face analysis


AWS Rekognition

Age 27-37
Gender Female, 91.2%
Sad 67.6%
Calm 10.1%
Confused 6.2%
Fear 6.1%
Disgusted 5%
Happy 2.2%
Surprised 1.5%
Angry 1.2%

AWS Rekognition

Age 23-31
Gender Female, 99.7%
Calm 94.8%
Sad 3.4%
Happy 1.4%
Disgusted 0.1%
Surprised 0.1%
Angry 0.1%
Fear 0.1%
Confused 0.1%
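
The emotion scores for each detected face sum to roughly 100%, so the face's dominant emotion is simply the highest-scoring entry. A minimal sketch, using the values from the second AWS Rekognition face above (the variable names are illustrative):

```python
# Emotion scores (percent) copied from the second AWS Rekognition
# face-analysis block in this record.
emotions = {
    "Calm": 94.8, "Sad": 3.4, "Happy": 1.4, "Disgusted": 0.1,
    "Surprised": 0.1, "Angry": 0.1, "Fear": 0.1, "Confused": 0.1,
}

# Pick the emotion with the highest confidence score.
dominant = max(emotions, key=emotions.get)  # "Calm"
```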

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%
Dog 94.7%

Captions

Microsoft

a vintage photo of a group of people posing for a picture 94%
a vintage photo of a group of people posing for the camera 92.8%
a group of people posing for a photo 92.7%

Text analysis

Amazon

3

Google

3
3