Human Generated Data

Title

Untitled (boy and girl bow and curtsey)

Date

1953, printed later

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.127

Machine Generated Data

Tags (label confidence, %)

Amazon
created on 2021-12-14

Clothing 99.8
Apparel 99.8
Person 99.2
Human 99.2
Person 98.6
Person 90.9
Footwear 83
Shoe 83
Suit 80.2
Overcoat 80.2
Coat 80.2
Floor 76.8
Flooring 69.2
Door 65.8
Sleeve 65.6
Shoe 63
Photography 60.1
Photo 60.1
Shorts 55.1
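
These label/confidence pairs have the shape of Amazon Rekognition's DetectLabels output. A minimal sketch of how such tags could be generated with boto3 follows; the file name, MaxLabels, and MinConfidence values are illustrative assumptions, not taken from this record:

```python
import boto3

client = boto3.client("rekognition")

# Placeholder file name; this record does not include the image source.
with open("photograph.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=20,        # assumed cap, roughly the length of the list above
        MinConfidence=55.0,  # assumed floor; the lowest score shown is 55.1
    )

# Print "Label confidence" pairs in the same style as this record.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```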

Clarifai
created on 2023-10-25

people 99.9
group 99.1
man 98.3
three 97
group together 96.8
woman 96.7
adult 95.4
two 95.2
portrait 92.8
family 92.4
collage 90.8
room 88.8
four 88.7
actor 84.8
leader 80
child 80
chair 79.2
education 77.4
administration 75.9
several 75.8
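
The Clarifai concepts above could plausibly come from the public general image-recognition model via Clarifai's v2 REST API. A hedged sketch, assuming a placeholder personal access token, model ID, and image URL:

```python
import requests

PAT = "YOUR_CLARIFAI_PAT"                    # placeholder credential
MODEL_ID = "general-image-recognition"       # assumed public model
IMAGE_URL = "https://example.org/photo.jpg"  # placeholder image

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {PAT}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
resp.raise_for_status()

# Clarifai returns concept values on a 0-1 scale; scale to percent to match
# the "people 99.9" style used in this record.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```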

Imagga
created on 2021-12-14

man 30.9
people 25.1
businessman 24.7
male 24.1
person 23.8
business 23.1
office 20.3
black 19.9
professional 18.9
executive 18.7
suit 17.9
corporate 17.2
adult 16.7
standing 16.5
meeting 14.1
groom 13.8
success 13.7
fashion 13.6
happy 13.1
couple 13.1
manager 13
group 12.9
men 12.9
team 12.5
job 12.4
interior 12.4
portrait 12.3
smile 12.1
work 11.8
dress 11.7
old 10.4
building 10.4
businesswoman 10
clothing 10
call 10
kin 9.9
pretty 9.8
attractive 9.8
boss 9.6
room 9.5
women 9.5
tie 9.5
elegant 9.4
two 9.3
communication 9.2
successful 9.1
style 8.9
photograph 8.8
model 8.5
career 8.5
window 8.5
elegance 8.4
teamwork 8.3
indoor 8.2
family 8
lifestyle 7.9
indoors 7.9
sitting 7.7
youth 7.7
desk 7.6
one 7.5
vintage 7.4
holding 7.4
lady 7.3
jacket 7.3
handsome 7.1
to 7.1
working 7.1
day 7.1
modern 7
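
Imagga's tagging endpoint returns the same tag/confidence structure. A minimal sketch against the /v2/tags endpoint, with placeholder credentials and image URL:

```python
import requests

API_KEY = "YOUR_IMAGGA_KEY"        # placeholder
API_SECRET = "YOUR_IMAGGA_SECRET"  # placeholder
IMAGE_URL = "https://example.org/photo.jpg"

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),  # Imagga uses HTTP basic auth
)
resp.raise_for_status()

# Confidences are already on a 0-100 scale, as in the list above.
for tag in resp.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')
```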

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

clothing 98.8
wall 97.3
person 93.1
text 90.2
footwear 88.8
man 75.9
gallery 66.4
woman 59.3
posing 49.3
picture frame 10.5
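
The Microsoft tags match the output shape of Azure Computer Vision's image-tagging operation. A sketch against the v3.2 REST endpoint, assuming a placeholder resource endpoint and key:

```python
import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_AZURE_KEY"                                          # placeholder
IMAGE_URL = "https://example.org/photo.jpg"

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/tag",
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": IMAGE_URL},
)
resp.raise_for_status()

# Confidence is reported on a 0-1 scale; scale to percent for this record's style.
for tag in resp.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```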

Color Analysis

[color palette swatches not captured in this text extraction]

Face analysis

AWS Rekognition

Age 14-26
Gender Female, 95.4%
Calm 97.8%
Sad 1%
Happy 0.4%
Confused 0.3%
Surprised 0.2%
Angry 0.2%
Disgusted 0.1%
Fear 0.1%

AWS Rekognition

Age 13-23
Gender Female, 60.5%
Calm 69.4%
Sad 26.2%
Confused 1.4%
Angry 1.3%
Fear 1%
Surprised 0.4%
Disgusted 0.2%
Happy 0.1%

AWS Rekognition

Age 14-26
Gender Female, 95.2%
Calm 94%
Confused 2.8%
Surprised 1.4%
Sad 0.9%
Angry 0.5%
Happy 0.2%
Fear 0.1%
Disgusted 0.1%
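
The three AWS Rekognition blocks read like per-face results from DetectFaces with full attributes requested. A sketch with boto3; the file name is a placeholder:

```python
import boto3

client = boto3.client("rekognition")

with open("photograph.jpg", "rb") as f:  # placeholder file name
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # requests age range, gender, and emotion scores
    )

# One block per detected face, mirroring the three blocks above.
for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```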

Microsoft Cognitive Services

Age 20
Gender Female

Microsoft Cognitive Services

Age 20
Gender Female
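
The Microsoft Cognitive Services age/gender estimates correspond to the Face API's face-attribute detection. Microsoft has since retired these attributes, so the sketch below reflects the older azure-cognitiveservices-vision-face SDK as it existed, with placeholder endpoint, key, and image URL:

```python
from azure.cognitiveservices.vision.face import FaceClient
from msrest.authentication import CognitiveServicesCredentials

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_FACE_KEY"                                           # placeholder
IMAGE_URL = "https://example.org/photo.jpg"

client = FaceClient(ENDPOINT, CognitiveServicesCredentials(KEY))

# "age" and "gender" were valid face attributes in this SDK generation.
faces = client.face.detect_with_url(
    url=IMAGE_URL,
    return_face_attributes=["age", "gender"],
)

for face in faces:
    attrs = face.face_attributes
    print(f"Age {attrs.age:.0f}")
    print(f"Gender {attrs.gender.capitalize()}")  # str-backed enum
```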

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
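
The Google Vision blocks use the library's likelihood scale (VERY_UNLIKELY through VERY_LIKELY) rather than numeric scores. A sketch with the google-cloud-vision client (v2+); the file name is a placeholder and credentials are assumed to come from the environment:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()  # reads GOOGLE_APPLICATION_CREDENTIALS

with open("photograph.jpg", "rb") as f:  # placeholder file name
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

def pretty(likelihood: int) -> str:
    # Render VERY_UNLIKELY as "Very unlikely", matching this record's style.
    return vision.Likelihood(likelihood).name.replace("_", " ").capitalize()

# One block per detected face, as in the three blocks above.
for face in response.face_annotations:
    print("Surprise", pretty(face.surprise_likelihood))
    print("Anger", pretty(face.anger_likelihood))
    print("Sorrow", pretty(face.sorrow_likelihood))
    print("Joy", pretty(face.joy_likelihood))
    print("Headwear", pretty(face.headwear_likelihood))
    print("Blurred", pretty(face.blurred_likelihood))
```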

Feature analysis

Amazon

Person 99.2%
Shoe 83%
Suit 80.2%

Categories

Imagga

paintings art 86.9%
people portraits 11.2%
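
Imagga also exposes scene categorization; labels like "paintings art" and "people portraits" are consistent with its personal-photos categorizer, though the exact categorizer used for this record is an assumption. A sketch reusing the placeholder credentials from the tagging example:

```python
import requests

API_KEY = "YOUR_IMAGGA_KEY"        # placeholder
API_SECRET = "YOUR_IMAGGA_SECRET"  # placeholder
IMAGE_URL = "https://example.org/photo.jpg"

# "personal_photos" is an assumed categorizer ID, not confirmed by this record.
resp = requests.get(
    "https://api.imagga.com/v2/categories/personal_photos",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
resp.raise_for_status()

for category in resp.json()["result"]["categories"]:
    print(f'{category["name"]["en"]} {category["confidence"]:.1f}%')
```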