Human Generated Data

Title

Untitled (men's rugby team)

Date

c. 1930

People

Artist: Hamblin Studio, American, active 1930s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1997

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Person 99.5
Human 99.5
Person 99.3
Person 98.8
Person 98.7
Person 98.4
Person 97.6
Clothing 96.8
Apparel 96.8
Person 96.8
Person 96.8
Person 96
Person 92.3
Person 91
Shorts 90.8
Person 75
Grass 73.1
Plant 73.1
People 71.9
Person 69.1
Sailor Suit 63.3
Floor 63.1
Face 62.5
Photography 62.5
Photo 62.5
Portrait 62.5
Kid 59.1
Child 59.1
Dress 58.9
Standing 57.4
Hand 57.4
Sport 56.8
Sports 56.8
Croquet 56.8
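
Each Amazon tag above pairs a label with a confidence score from Amazon Rekognition. A minimal sketch of how such labels can be requested with the boto3 client — the file name is a placeholder, and the exact parameters behind this record are an assumption:

    import boto3

    # Placeholder path standing in for the digitized photograph.
    with open("rugby_team.jpg", "rb") as f:
        image_bytes = f.read()

    client = boto3.client("rekognition")

    # MinConfidence trims low-scoring labels, much like the cutoff in the list above.
    response = client.detect_labels(Image={"Bytes": image_bytes}, MinConfidence=55)

    for label in response["Labels"]:
        # Prints lines such as "Person 99.5".
        print(f'{label["Name"]} {label["Confidence"]:.1f}')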

Imagga
created on 2021-12-14

negative 97.4
film 75.8
photographic paper 56.3
photographic equipment 37.5
picket fence 32
landscape 26
fence 25.1
snow 20.3
structure 19.7
barrier 19.1
forest 15.7
park 15.6
grunge 15.3
black 15
water 14.7
winter 14.5
old 13.9
outdoors 13.4
sky 13.4
travel 13.4
art 13.3
tree 13.1
sun 12.9
obstruction 12.8
paint 12.7
river 12.5
outdoor 12.2
summer 12.2
white 12
weather 11.9
dark 11.7
vintage 11.6
trees 11.6
rural 11.5
cold 11.2
texture 11.1
stone 11
beach 11
silhouette 10.8
fountain 10.7
ice 10.7
natural 10.7
rock 10.4
scene 10.4
pattern 10.3
scenery 9.9
environment 9.9
mountain 9.8
fog 9.6
drawing 9.6
lake 9.3
tourism 9.1
morning 9
dirty 9
coast 9
history 8.9
antique 8.7
sea 8.6
season 8.6
grungy 8.5
space 8.5
design 8.4
decoration 8.4
vacation 8.2
fall 8.1
light 8
cool 8
scenic 7.9
holiday 7.9
foggy 7.9
wild 7.8
mist 7.7
frame 7.5
ocean 7.5
man 7.4
ecology 7.3
effect 7.3
border 7.2
grass 7.1
architecture 7
country 7
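
The Imagga list comes from an image-tagging endpoint. One plausible sketch against Imagga's v2 REST API using the requests library — the key, secret, and image URL are placeholders, and the response shape follows Imagga's documented v2 format rather than anything confirmed by this record:

    import requests

    # Placeholder credentials and image URL.
    API_KEY, API_SECRET = "your_key", "your_secret"
    IMAGE_URL = "https://example.org/rugby_team.jpg"

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(API_KEY, API_SECRET),
    )

    for item in response.json()["result"]["tags"]:
        # Prints lines such as "negative 97.4".
        print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')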

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

text 97
old 95.9
person 82.7
posing 82.6
black 82.1
white 77.9
gallery 75.3
room 53.9
black and white 50.9
picture frame 45.2
image 39.1
painted 31.2
team 25.1
painting 15.4
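
The Microsoft tags can be reproduced in spirit with the Azure Computer Vision SDK; a sketch, assuming a standard resource endpoint and key (both placeholders, as is the image URL):

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    # Placeholder endpoint and key.
    client = ComputerVisionClient(
        "https://<your-resource>.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("your_key"),
    )

    result = client.tag_image("https://example.org/rugby_team.jpg")
    for tag in result.tags:
        # Azure reports confidence in [0, 1]; scale to match the list above.
        print(f"{tag.name} {tag.confidence * 100:.1f}")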

Face analysis

AWS Rekognition

Age 50-68
Gender Male, 93.2%
Calm 95.5%
Sad 2.1%
Happy 1.1%
Confused 0.6%
Surprised 0.3%
Angry 0.3%
Disgusted 0.1%
Fear 0.1%

AWS Rekognition

Age 36-52
Gender Male, 92.2%
Calm 96.5%
Sad 1.5%
Angry 1.1%
Happy 0.3%
Confused 0.2%
Surprised 0.1%
Disgusted 0.1%
Fear 0.1%

AWS Rekognition

Age 42-60
Gender Male, 82.2%
Calm 62.2%
Happy 31%
Angry 2.7%
Sad 2.3%
Disgusted 0.8%
Confused 0.4%
Surprised 0.4%
Fear 0.1%

AWS Rekognition

Age 23-37
Gender Male, 59.8%
Calm 88.5%
Sad 6.5%
Happy 2%
Confused 1.5%
Angry 0.6%
Surprised 0.5%
Disgusted 0.2%
Fear 0.1%

AWS Rekognition

Age 49-67
Gender Male, 75.1%
Calm 94.3%
Happy 4.6%
Sad 0.6%
Angry 0.2%
Surprised 0.1%
Disgusted 0.1%
Confused 0.1%
Fear 0%

AWS Rekognition

Age 31-47
Gender Female, 52.5%
Calm 53.9%
Happy 42.3%
Sad 1.2%
Surprised 0.9%
Angry 0.7%
Confused 0.6%
Disgusted 0.2%
Fear 0.2%

AWS Rekognition

Age 52-70
Gender Female, 56%
Calm 92.7%
Sad 2.7%
Angry 1.5%
Happy 1.3%
Confused 0.8%
Fear 0.4%
Surprised 0.4%
Disgusted 0.1%

AWS Rekognition

Age 23-35
Gender Male, 84.7%
Calm 92.4%
Sad 4.3%
Happy 1.9%
Angry 0.5%
Disgusted 0.3%
Confused 0.2%
Surprised 0.2%
Fear 0.1%
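
Each AWS Rekognition block above describes one detected face: an estimated age range, a gender guess with its confidence, and a full emotion distribution. A minimal boto3 sketch of the call that returns these fields (the file name is a placeholder):

    import boto3

    with open("rugby_team.jpg", "rb") as f:
        image_bytes = f.read()

    client = boto3.client("rekognition")

    # Attributes=["ALL"] requests age range, gender, and emotions per face.
    response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

    for face in response["FaceDetails"]:
        age, gender = face["AgeRange"], face["Gender"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            # Prints lines such as "Calm 95.5%".
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')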

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely
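
Unlike Rekognition's percentages, Google Vision reports likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) per face. A sketch with the google-cloud-vision client (the file path is a placeholder):

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    # Placeholder path standing in for the digitized photograph.
    with open("rugby_team.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    for face in response.face_annotations:
        # Each field is a Likelihood enum; .name gives e.g. "VERY_UNLIKELY".
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)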

Feature analysis

Amazon

Person 99.5%

Captions

Microsoft

a vintage photo of a group of people posing for the camera 76.8%
a vintage photo of a man 76.7%
a vintage photo of a group of people posing for a picture 76.6%
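
The Microsoft captions are ranked caption candidates, each with a confidence. A sketch using the Azure Computer Vision SDK, with the same placeholder endpoint, key, and image URL as in the tag sketch above; max_candidates=3 mirrors the three captions listed:

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    # Placeholder endpoint and key, as in the tag sketch.
    client = ComputerVisionClient(
        "https://<your-resource>.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("your_key"),
    )

    result = client.describe_image(
        "https://example.org/rugby_team.jpg", max_candidates=3
    )
    for caption in result.captions:
        # Prints lines such as "a vintage photo of a man 76.7%".
        print(f"{caption.text} {caption.confidence * 100:.1f}%")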