Human Generated Data

Title

Untitled (portrait of eleven female grandchildren seated on and around couch)

Date

c. 1950

People

Artist: Harry Annas, American, 1897–1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.2986

Machine Generated Data

Tags (label with confidence score, %)

Amazon
created on 2022-01-21

Dress 99.7
Clothing 99.7
Apparel 99.7
Person 98.5
Human 98.5
Person 98.1
Person 97.8
Person 97.7
Female 96.2
Person 96
Person 92.4
Person 92.3
Person 90.1
Stage 81.9
Woman 81
Person 80.5
Helmet 80.3
Girl 79.9
Skirt 79.6
Costume 78.7
Face 78.3
Person 78.2
Leisure Activities 77.1
Kid 75.6
Child 75.6
Icing 73.4
Food 73.4
Dessert 73.4
Cake 73.4
Cream 73.4
Creme 73.4
Performer 71.6
Dance Pose 68.4
People 65.9
Shorts 65.4
Photography 64.2
Photo 64.2
Portrait 64.1
Indoors 60.4
Helmet 57.5
Floor 55.6
Chair 55.2
Furniture 55.2
Person 47.2
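
The label list above is the shape of output produced by Amazon Rekognition's DetectLabels operation. A minimal sketch in Python with boto3, assuming the scan sits in S3 (the bucket, key, and region below are placeholders, not the museum's actual storage):

import boto3

# Hypothetical client and image location; replace with real values.
rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "scans/annas-portrait.jpg"}},
    MaxLabels=50,
    MinConfidence=40,
)

# Each label carries a name and a confidence score in percent, matching the
# "label  score" pairs listed above.
for label in response["Labels"]:
    print(label["Name"], round(label["Confidence"], 1))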

Imagga
created on 2022-01-21

brass 46.7
wind instrument 35.8
musical instrument 24.7
people 22.3
football helmet 21.8
helmet 19.5
person 19.2
male 19.1
man 17.5
adult 16.3
sky 15.9
silhouette 15.7
headdress 14.9
dancer 13.4
sport 13.3
black 13.2
art 12.9
water 12.7
men 12
body 12
dance 11.7
lifestyle 11.6
travel 11.3
human 11.2
clothing 11.1
fun 10.5
summer 10.3
exercise 10
performer 9.8
blackboard 9.7
clouds 9.3
reflection 9.2
world 9.2
hand 9.1
modern 9.1
design 9
equipment 8.9
style 8.9
cool 8.9
model 8.5
dark 8.3
ocean 8.3
fashion 8.3
fitness 8.1
active 8.1
sunset 8.1
team 8.1
group 8.1
player 8
backboard 7.8
boy 7.8
wave 7.8
cloud 7.7
party 7.7
grunge 7.7
outdoor 7.6
athlete 7.6
landscape 7.4
event 7.4
graphic 7.3
teenager 7.3
pose 7.2
color 7.2
recreation 7.2
park 7.1
portrait 7.1
women 7.1
night 7.1
ball 7
together 7
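
Tag/score pairs like the Imagga list above can be produced with Imagga's v2 tagging endpoint; this is a hedged sketch assuming that endpoint and its documented JSON shape, with placeholder credentials and image URL:

import requests

IMAGGA_KEY = "your_api_key"        # placeholder
IMAGGA_SECRET = "your_api_secret"  # placeholder

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/annas-portrait.jpg"},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
)

# Each tag has an English label and a confidence score in percent.
for item in resp.json()["result"]["tags"]:
    print(item["tag"]["en"], round(item["confidence"], 1))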

Google
created on 2022-01-21

Microsoft
created on 2022-01-21

text 98.5
posing 89.7
person 84.3
dance 84.2
clothing 77.9
old 72.3
drawing 51.3
vintage 29.9
picture frame 6.3
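
The Microsoft tag list above matches the Tags feature of Azure Computer Vision's Analyze Image operation. A sketch against the v3.2 REST endpoint, assuming an Azure resource exists (endpoint, key, and image URL are placeholders); the API reports confidences between 0 and 1, so the listing above appears to show them scaled to percentages:

import requests

endpoint = "https://your-resource.cognitiveservices.azure.com"  # placeholder
key = "your_subscription_key"                                   # placeholder

resp = requests.post(
    f"{endpoint}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": key, "Content-Type": "application/json"},
    json={"url": "https://example.org/annas-portrait.jpg"},
)

for tag in resp.json()["tags"]:
    print(tag["name"], round(tag["confidence"] * 100, 1))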

Face analysis

AWS Rekognition

Age 26-36
Gender Male, 99.4%
Calm 95.2%
Sad 1.6%
Angry 0.8%
Happy 0.7%
Disgusted 0.5%
Confused 0.4%
Surprised 0.4%
Fear 0.3%

AWS Rekognition

Age 30-40
Gender Male, 65.9%
Happy 65%
Surprised 27.5%
Calm 2.3%
Fear 1.9%
Angry 0.9%
Confused 0.8%
Disgusted 0.8%
Sad 0.7%

AWS Rekognition

Age 38-46
Gender Male, 85.4%
Happy 83.4%
Calm 5.2%
Surprised 3.5%
Sad 3.5%
Confused 1.5%
Angry 1.1%
Fear 1%
Disgusted 0.8%

AWS Rekognition

Age 30-40
Gender Male, 99.9%
Calm 93.3%
Surprised 4.8%
Sad 0.5%
Angry 0.4%
Fear 0.3%
Happy 0.3%
Disgusted 0.2%
Confused 0.2%

AWS Rekognition

Age 33-41
Gender Male, 99.9%
Calm 60.9%
Happy 26.5%
Sad 5.6%
Confused 2%
Surprised 1.5%
Disgusted 1.3%
Angry 1.2%
Fear 0.9%

AWS Rekognition

Age 30-40
Gender Female, 71.3%
Surprised 88.2%
Calm 4.5%
Happy 3.6%
Fear 1.8%
Angry 0.7%
Sad 0.6%
Disgusted 0.4%
Confused 0.2%

AWS Rekognition

Age 33-41
Gender Female, 93%
Calm 51.4%
Happy 33.3%
Surprised 11.7%
Sad 1.2%
Angry 0.7%
Disgusted 0.7%
Confused 0.6%
Fear 0.3%

AWS Rekognition

Age 47-53
Gender Male, 97.6%
Calm 76.4%
Surprised 17.9%
Happy 2.7%
Fear 1.1%
Angry 0.6%
Disgusted 0.5%
Sad 0.5%
Confused 0.4%

AWS Rekognition

Age 2-10
Gender Female, 89.8%
Sad 71.9%
Surprised 22.6%
Calm 1.4%
Disgusted 1%
Fear 0.9%
Confused 0.8%
Angry 0.8%
Happy 0.7%

AWS Rekognition

Age 45-53
Gender Male, 55.2%
Calm 69.5%
Surprised 26.3%
Sad 0.9%
Angry 0.9%
Happy 0.9%
Fear 0.8%
Disgusted 0.6%
Confused 0.2%

AWS Rekognition

Age 35-43
Gender Female, 55%
Sad 50.1%
Calm 24.1%
Surprised 11.4%
Happy 6.5%
Confused 2.4%
Fear 2.4%
Angry 2.2%
Disgusted 0.9%
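
The per-face age ranges, gender estimates, and emotion scores above are the shape of output returned by Rekognition's DetectFaces operation with full attributes. A minimal boto3 sketch (same placeholder S3 location as before):

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "scans/annas-portrait.jpg"}},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions come back as a list of (type, confidence) pairs; sort to match
    # the highest-first ordering used in the listing above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")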

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
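
The likelihood ratings above (Surprise, Anger, Sorrow, Joy, Headwear, Blurred) correspond to fields on the face annotations returned by the Google Cloud Vision API. A sketch with the google-cloud-vision client library, using a placeholder image URL:

from google.cloud import vision

client = vision.ImageAnnotatorClient()
image = vision.Image(source=vision.ImageSource(image_uri="https://example.org/annas-portrait.jpg"))

response = client.face_detection(image=image)

# Each face annotation carries a likelihood enum (VERY_UNLIKELY ... VERY_LIKELY)
# for every attribute shown in the listing above.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)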

Feature analysis

Amazon

Person 98.5%
Helmet 80.3%

Captions

Microsoft

a vintage photo of a group of people posing for the camera 89.4%
a vintage photo of a group of people posing for a picture 89.3%
a group of people posing for a photo 89.2%
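
Ranked captions like the three above are what Azure Computer Vision's Describe Image operation returns. A sketch against the v3.2 REST endpoint, with the same placeholder resource and image URL as the earlier Microsoft example:

import requests

endpoint = "https://your-resource.cognitiveservices.azure.com"  # placeholder
key = "your_subscription_key"                                   # placeholder

resp = requests.post(
    f"{endpoint}/vision/v3.2/describe",
    params={"maxCandidates": 3},
    headers={"Ocp-Apim-Subscription-Key": key, "Content-Type": "application/json"},
    json={"url": "https://example.org/annas-portrait.jpg"},
)

# Captions are returned with confidences between 0 and 1, shown above as percentages.
for caption in resp.json()["description"]["captions"]:
    print(caption["text"], f"{caption['confidence'] * 100:.1f}%")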

Text analysis

Amazon

RODAR--2.VEETA--EW
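
The string above is the kind of output produced by Rekognition's DetectText operation for text it finds in the scanned image. A minimal boto3 sketch (same placeholder S3 location as before):

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_text(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "scans/annas-portrait.jpg"}}
)

# LINE detections are the assembled strings; WORD detections are their pieces.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], f"{detection['Confidence']:.1f}%")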