Human Generated Data

Title

Untitled (children in costumes auditioning for theater, man and girls)

Date

c. 1947

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15695.2

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Person 98.1
Human 98.1
Clothing 98
Apparel 98
Person 97.7
Person 96.8
Person 96.4
Person 94.5
Female 88.7
Face 86.3
People 84.5
Person 78.5
Suit 75.9
Overcoat 75.9
Coat 75.9
Girl 71.8
Woman 71.5
Plant 69.3
Portrait 68.6
Photography 68.6
Photo 68.6
Door 67.1
Kid 66.3
Child 66.3
Text 65.3
Shorts 65
Furniture 60.3
Room 58.2
Indoors 58.2
Sitting 55.6
Floor 55.2

Clarifai
created on 2023-10-29

people 99.8
group 98.5
man 97.7
adult 97.6
leader 97.6
monochrome 95.6
administration 95.1
interaction 92.7
two 91.2
chair 87.5
actor 87.5
outfit 87.3
home 86.5
woman 85.4
family 85.4
wear 84.3
wedding 84.2
three 82.3
elderly 81.5
nostalgia 81.1

Imagga
created on 2022-02-05

man 34.3
male 28.5
people 28.4
newspaper 27.5
person 26.1
adult 25.8
businessman 24.7
groom 23.2
professional 21.9
product 21.3
office 20.4
business 19.4
couple 19.2
indoors 18.4
happy 17.5
corporate 17.2
portrait 16.8
job 16.8
smiling 16.6
creation 16.6
home 16
women 15.8
executive 15.4
men 14.6
two 14.4
employee 14.3
meeting 14.1
occupation 13.7
indoor 13.7
room 13.6
work 13.4
happiness 13.3
modern 13.3
businesspeople 13.3
worker 12.7
family 12.5
career 12.3
lifestyle 12.3
mature 12.1
looking 12
color 11.7
team 11.6
holding 11.6
senior 11.3
teacher 11.2
clothing 11.2
sitting 11.2
casual 11
communication 10.9
businesswoman 10.9
window 10.8
smile 10.7
interior 10.6
cheerful 10.6
bride 10.5
together 10.5
group 10.5
old 10.5
life 10.2
dress 9.9
care 9.9
talking 9.5
desk 9.4
manager 9.3
building 9
suit 9
new 8.9
computer 8.8
love 8.7
face 8.5
pretty 8.4
attractive 8.4
house 8.4
teamwork 8.3
inside 8.3
laptop 8.2
waiter 8.1
handsome 8
medical 7.9
mother 7.9
standing 7.8
educator 7.6
formal 7.6
phone 7.4
wedding 7.4
alone 7.3
confident 7.3
black 7.2
grandma 7.2
working 7.1
child 7.1

Google
created on 2022-02-05

Microsoft
created on 2022-02-05

person 99.4
outdoor 95.6
text 94.4
standing 86
clothing 85.9
dress 78
woman 77
drawing 76.9
posing 75.1
old 60.4

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 28-38
Gender Female, 85%
Happy 66.3%
Calm 22.6%
Sad 5.2%
Surprised 3.7%
Angry 0.9%
Disgusted 0.5%
Fear 0.5%
Confused 0.2%

AWS Rekognition

Age 23-33
Gender Female, 93.6%
Calm 96.2%
Sad 2.7%
Happy 0.7%
Confused 0.1%
Disgusted 0.1%
Angry 0.1%
Surprised 0.1%
Fear 0%

Google

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Person 98.1%
Person 97.7%
Person 96.8%
Person 96.4%
Person 94.5%
Person 78.5%

Categories

Imagga

paintings art 99.7%