Human Generated Data

Title

Untitled (comedian dancing and jumping)

Date

1947

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.14710

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Person 99.5
Human 99.5
Clothing 91.5
Apparel 91.5
Floor 89.6
Flooring 89
Art 77.5
Drawing 77.5
Text 61.2
Outdoors 60.7
Female 60.4
Girl 60.1
Water 57.4
Child 57.4
Kid 57.4
Play 56.8
Coat 55.7
Overcoat 55.7

Imagga
created on 2022-01-29

musical instrument 60.6
accordion 51.7
keyboard instrument 41.9
wind instrument 31.2
shopping cart 28.9
cart 22.4
shopping 22
wheeled vehicle 20.4
man 18.8
handcart 17.6
business 17
buy 16.9
people 16.2
male 15.6
shop 15.3
menorah 15.1
metal 13.7
container 13.5
market 13.3
drawing 13.3
architecture 13.3
supermarket 12.8
sketch 12.7
sale 12
basket 12
travel 12
trolley 11.8
percussion instrument 11.5
retail 11.4
candelabrum 11.3
balcony 11.1
money 11.1
transportation 10.8
vibraphone 10.5
urban 10.5
adult 10.4
store 10.4
empty 10.3
men 10.3
lifestyle 10.1
structure 9.8
interior 9.7
sport 9.5
holiday 9.3
speed 9.2
transport 9.1
office 8.8
water 8.7
skateboard 8.6
construction 8.6
wall 8.5
push 8.5
person 8.5
finance 8.4
candlestick 8.4
house 8.4
metallic 8.3
work 8.2
vacation 8.2
equipment 8.1
building 7.9
conveyance 7.9
women 7.9
design 7.9
day 7.8
couple 7.8
black 7.8
consumer 7.8
modern 7.7
purchase 7.7
sky 7.7
trade 7.6
walk 7.6
human 7.5
city 7.5
symbol 7.4
board 7.2
active 7.2
activity 7.2
businessman 7.1

Google
created on 2022-01-29

Microsoft
created on 2022-01-29

text 96.8
drawing 86.4
old 82.6
cartoon 81.6
sketch 67.4
person 64.6

Face analysis

AWS Rekognition

Age 27-37
Gender Male, 99.9%
Sad 59.9%
Happy 16.6%
Calm 12.4%
Surprised 5.5%
Fear 2.9%
Confused 1.4%
Angry 0.7%
Disgusted 0.7%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%

Captions

Microsoft

a vintage photo of a person 78.5%
a vintage photo of a person 77.7%
an old photo of a person 77.6%

Text analysis

Amazon

e
MJIR
MJIR YT33AS
YT33AS

Google

MJIR YT3RA
MJIR
YT3RA