Human Generated Data

Title

Untitled (line of boys learning to dance)

Date

c. 1950

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15474

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99.8
Human 99.8
Person 99.8
Person 99.7
Person 99.6
Clothing 98.3
Apparel 98.3
Shoe 97.8
Footwear 97.8
Person 96.6
Shoe 96.6
Shoe 92.7
Sleeve 90.5
Person 89.1
Flooring 86.6
Floor 86.2
Long Sleeve 81.7
Bridge 72.3
Boardwalk 72.3
Building 72.3
Shoe 70.5
Shoe 69.1
Tie 68.7
Accessories 68.7
Accessory 68.7
Pants 65.3
Photography 65
Photo 65
Portrait 63.6
Face 63.6
Drawing 62.2
Art 62.2
Shorts 58.6
Wood 58
Path 55.9
Shoe 55.3

Imagga
created on 2022-03-05

people 36.8
negative 33.3
man 30.9
adult 29.4
male 27.7
film 27.5
person 26.4
women 24.5
business 24.3
group 23.4
bathing cap 21.1
photographic paper 20.5
portrait 20.1
couple 20
men 19.8
businessman 19.4
team 17.9
businesswoman 17.3
walking 17
cap 16.6
clothing 16.5
happy 16.3
smiling 15.9
indoors 15.8
standing 15.6
corporate 15.5
lifestyle 15.2
human 15
happiness 14.9
professional 14.6
photographic equipment 13.7
smile 13.5
nurse 13.5
love 13.4
life 13.4
day 13.3
headdress 12.9
casual 12.7
crowd 12.5
city 12.5
family 12.5
urban 11.4
fashion 11.3
dress 10.8
interior 10.6
motion 10.3
worker 10.2
two 10.2
face 9.9
suit 9.9
job 9.7
black 9.6
blurred 9.6
work 9.4
company 9.3
teamwork 9.3
inside 9.2
20s 9.2
modern 9.1
pretty 9.1
activity 9
building 9
office 8.8
child 8.7
full length 8.7
businesspeople 8.5
adults 8.5
attractive 8.4
old 8.4
window 8.3
health 8.3
occupation 8.2
outdoors 8.2
vacation 8.2
working 8
together 7.9
holiday 7.9
travel 7.7
silhouette 7.4
mature 7.4
holding 7.4
successful 7.3
looking 7.2
body 7.2
home 7.2
bride 7.2
handsome 7.1
staff 7.1

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

footwear 98.3
text 98
clothing 93.6
person 89.5
posing 38.8

Face analysis

AWS Rekognition

Age 28-38
Gender Male, 99.7%
Sad 76.6%
Calm 7.3%
Angry 6.3%
Disgusted 5.6%
Surprised 1.6%
Confused 1.3%
Happy 0.9%
Fear 0.5%

AWS Rekognition

Age 35-43
Gender Female, 97.5%
Calm 72.9%
Happy 23.1%
Sad 2.3%
Surprised 0.6%
Angry 0.4%
Confused 0.3%
Disgusted 0.2%
Fear 0.1%

AWS Rekognition

Age 30-40
Gender Female, 86.7%
Calm 92.2%
Sad 3.8%
Surprised 1.4%
Confused 0.7%
Angry 0.6%
Disgusted 0.6%
Happy 0.3%
Fear 0.3%

AWS Rekognition

Age 23-33
Gender Female, 83.6%
Calm 75.5%
Sad 22.8%
Happy 0.6%
Confused 0.4%
Disgusted 0.2%
Angry 0.2%
Surprised 0.1%
Fear 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.8%
Shoe 97.8%
Tie 68.7%

Captions

Microsoft

a group of people standing next to a window 86.7%
a group of people standing in front of a window 85.4%
a group of people posing for a photo 84.1%

Text analysis

Amazon

GALLANT

Google

GALLANT
GALLANT