Human Generated Data

Title

Untitled (three young women sitting on a bench, more benches and people in background)

Date

c. 1950

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.14356

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Clothing 100
Apparel 100
Person 99.6
Human 99.6
Person 99.3
Person 98
Robe 95.8
Fashion 95.8
Person 95.2
Gown 93.6
Wedding 92.4
Female 90.8
Suit 89.1
Overcoat 89.1
Coat 89.1
Bridegroom 86.1
Person 85.2
Person 84.9
Wedding Gown 82.7
Bride 81.6
Dress 80.4
Nature 77.9
Woman 76.5
Outdoors 74.6
Helmet 72.9
Person 70
Furniture 69.2
Portrait 67.1
Photography 67.1
Face 67.1
Photo 67.1
Chair 64.7
Girl 59.8
People 59.2
Shorts 58.6
Leisure Activities 55.5
Tuxedo 55.3
Person 54
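
The repeated "Person" entries in the Amazon list above reflect one row per detected instance of that label, alongside the label's overall confidence. As a minimal sketch of how such a flat tag list could be derived (assuming a Rekognition-style `detect_labels` response shape; the sample values below are illustrative, drawn from a few of the tags above):

```python
# Illustrative, trimmed response modeled on AWS Rekognition's
# detect_labels output; the field names are an assumption about the
# API shape, not taken from this page.
response = {
    "Labels": [
        {"Name": "Clothing", "Confidence": 100.0, "Instances": []},
        {"Name": "Person", "Confidence": 99.6, "Instances": [
            {"Confidence": 99.6}, {"Confidence": 99.3}, {"Confidence": 98.0},
        ]},
        {"Name": "Helmet", "Confidence": 72.9, "Instances": [
            {"Confidence": 72.9},
        ]},
    ]
}

def flatten_tags(resp):
    """Emit (name, confidence) rows: one row per detected instance
    when instances exist, otherwise one row for the label itself."""
    rows = []
    for label in resp["Labels"]:
        if label["Instances"]:
            rows.extend((label["Name"], round(i["Confidence"], 1))
                        for i in label["Instances"])
        else:
            rows.append((label["Name"], round(label["Confidence"], 1)))
    # Sort by descending confidence, mirroring the listing above.
    return sorted(rows, key=lambda r: -r[1])

for name, conf in flatten_tags(response):
    print(name, conf)
```

Duplicate label names at different confidences, as in the listing, fall out naturally from the per-instance rows.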

Clarifai
created on 2023-10-27

people 99.9
adult 98.6
man 98.4
group 97.6
woman 95.7
group together 95.1
wear 92.1
actor 90
wedding 88.6
recreation 88.5
two 87.4
several 86.5
leader 86
three 83.8
many 83.8
actress 82.7
veil 82.5
administration 78.4
interaction 78.2
chair 76.1

Imagga
created on 2022-01-29

people 32.9
person 26.9
adult 26.3
man 25.1
snow 22.3
male 21.3
active 20.5
winter 17.9
sport 17.8
lifestyle 17.4
beach 16
exercise 14.5
cold 13.8
men 13.7
fashion 13.6
walking 13.3
happy 13.2
clothing 13.1
vacation 13.1
portrait 12.9
black 12.6
leisure 12.5
leg 12.4
outdoor 12.2
outdoors 11.9
fitness 11.7
activity 11.6
crutch 11.6
fun 11.2
pretty 11.2
attractive 11.2
women 11.1
relax 11
business 10.9
body 10.4
sitting 10.3
sand 10.3
action 10.2
staff 9.9
summer 9.6
happiness 9.4
sea 9.4
casual 9.3
smile 9.3
travel 9.2
city 9.1
dress 9
lady 8.9
professional 8.9
urban 8.7
holiday 8.6
motion 8.6
outside 8.6
model 8.6
wall 8.6
stick 8.4
street 8.3
silhouette 8.3
suit 8.2
pose 8.2
office 8
smiling 8
dancer 7.9
hair 7.9
couple 7.8
elegant 7.7
ski 7.6
legs 7.6
enjoy 7.5
human 7.5
ocean 7.5
garment 7.4
water 7.3
alone 7.3
recreation 7.2
athlete 7.2
posing 7.1
businessman 7.1
day 7.1
look 7

Microsoft
created on 2022-01-29

text 98.6
clothing 86.9
person 83.3
footwear 72.2
dance 72.2
black and white 59.1
posing 50.1

Face analysis

AWS Rekognition

Age 29-39
Gender Male, 98.2%
Surprised 98.5%
Calm 0.5%
Sad 0.3%
Happy 0.2%
Angry 0.1%
Fear 0.1%
Disgusted 0.1%
Confused 0.1%

AWS Rekognition

Age 23-33
Gender Female, 55.1%
Calm 46.2%
Sad 31.3%
Happy 10.6%
Surprised 6.4%
Confused 2%
Fear 1.3%
Angry 1%
Disgusted 1%

AWS Rekognition

Age 53-61
Gender Male, 90.2%
Calm 72.8%
Sad 21.7%
Confused 2.3%
Happy 1%
Surprised 0.7%
Disgusted 0.6%
Angry 0.6%
Fear 0.3%

AWS Rekognition

Age 20-28
Gender Male, 90.6%
Calm 84.9%
Sad 13.5%
Confused 0.6%
Angry 0.4%
Surprised 0.3%
Disgusted 0.1%
Happy 0.1%
Fear 0%

AWS Rekognition

Age 25-35
Gender Female, 65.5%
Calm 98.7%
Sad 0.9%
Fear 0.1%
Surprised 0.1%
Happy 0.1%
Disgusted 0%
Confused 0%
Angry 0%
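
Each AWS Rekognition block above summarizes one detected face: an age range, a gender call with confidence, and per-emotion confidences that sum to roughly 100%. A minimal sketch of condensing such a record into a one-line summary (assuming Rekognition's `detect_faces` face-detail structure; the sample numbers are taken from the first face above, but the field names are an assumption):

```python
# Illustrative face record modeled on AWS Rekognition detect_faces
# output; the key names are an assumed API shape.
face = {
    "AgeRange": {"Low": 29, "High": 39},
    "Gender": {"Value": "Male", "Confidence": 98.2},
    "Emotions": [
        {"Type": "SURPRISED", "Confidence": 98.5},
        {"Type": "CALM", "Confidence": 0.5},
        {"Type": "SAD", "Confidence": 0.3},
    ],
}

def summarize_face(face):
    """Return 'Age L-H, Gender (c%), DominantEmotion c%' for one face."""
    top = max(face["Emotions"], key=lambda e: e["Confidence"])
    return (f"Age {face['AgeRange']['Low']}-{face['AgeRange']['High']}, "
            f"{face['Gender']['Value']} ({face['Gender']['Confidence']}%), "
            f"{top['Type'].capitalize()} {top['Confidence']}%")

print(summarize_face(face))
# -> Age 29-39, Male (98.2%), Surprised 98.5%
```

The dominant emotion is simply the entry with the highest confidence, which is how the leading emotion in each block above would be picked out.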

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Helmet
Person 99.6%
Person 99.3%
Person 98%
Person 95.2%
Person 85.2%
Person 84.9%
Person 70%
Person 54%
Helmet 72.9%

Categories

Imagga

paintings art 100%

Text analysis

Amazon

4
MJI7
MJI7 YT3RAS ACSHA
YT3RAS
ACSHA

Google

MJ17 YT3RA2 A
MJ17
YT3RA2
A