Human Generated Data

Title

Untitled (several people including man on bike looking into shop window at mysterious vacuum display)

Date

c. 1950

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.14683

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Person 99.8
Human 99.8
Machine 99.5
Wheel 99.5
Vehicle 99.5
Bicycle 99.5
Transportation 99.5
Bike 99.5
Person 99.2
Person 98.7
Clothing 83.2
Apparel 83.2
Person 77.5
Boat 73.1
Person 72.9
Coat 65.2
Spoke 64.3
Appliance 58.6
Text 57.5
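The tag list above is typical of the (label, confidence) pairs returned by AWS Rekognition's DetectLabels API, with the same label (e.g. Person) detected several times. A minimal sketch of how such a list might be deduplicated and thresholded; the helper name `top_labels` and the 90% cutoff are illustrative, not part of the source:

```python
# Labels as (name, confidence) pairs, values taken from the tag list above.
labels = [
    ("Person", 99.8), ("Human", 99.8), ("Machine", 99.5), ("Wheel", 99.5),
    ("Vehicle", 99.5), ("Bicycle", 99.5), ("Transportation", 99.5),
    ("Bike", 99.5), ("Person", 99.2), ("Person", 98.7), ("Clothing", 83.2),
    ("Apparel", 83.2), ("Person", 77.5), ("Boat", 73.1), ("Person", 72.9),
    ("Coat", 65.2), ("Spoke", 64.3), ("Appliance", 58.6), ("Text", 57.5),
]

def top_labels(pairs, threshold=90.0):
    """Deduplicate labels, keeping each name's highest confidence,
    then drop anything below the threshold, sorted by confidence."""
    best = {}
    for name, conf in pairs:
        best[name] = max(conf, best.get(name, 0.0))
    return sorted(
        ((n, c) for n, c in best.items() if c >= threshold),
        key=lambda nc: -nc[1],
    )

print(top_labels(labels))
```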

Imagga
created on 2022-01-29

iron lung 47.1
respirator 37.7
breathing device 29.6
sketch 27.5
device 26.4
drawing 22.6
home 19.1
people 17.8
room 16.9
nurse 16.8
work 16.5
representation 16.5
man 16.1
interior 15.9
technology 15.6
working 15
indoors 14.9
equipment 14.6
business 14.6
male 14.2
modern 14
house 13.4
adult 12.3
glass 12
hospital 11
medical 10.6
medicine 10.6
person 10.4
doctor 9.4
clean 9.2
kitchen 9.1
furniture 9
design 9
patient 8.9
light 8.7
architecture 8.6
network 8.5
table 8.3
glasses 8.3
window 8.2
lifestyle 7.9
building 7.8
men 7.7
health 7.6
research 7.6
communication 7.6
connection 7.3
computer 7.3
data 7.3
metal 7.2
information 7.1
job 7.1
decor 7.1
server 7

Google
created on 2022-01-29

Microsoft
created on 2022-01-29

text 99.4
bicycle 92.8
wheel 89.3
land vehicle 75.4
vehicle 74.1
bicycle wheel 69.1
person 59.8
drawing 58.6

Face analysis

AWS Rekognition

Age 45-53
Gender Male, 57.4%
Sad 54.8%
Calm 21.5%
Fear 6.3%
Happy 4.8%
Angry 4.4%
Surprised 3.5%
Disgusted 3.4%
Confused 1.2%

AWS Rekognition

Age 18-24
Gender Male, 95.6%
Calm 72.6%
Confused 10.4%
Sad 5.6%
Disgusted 3.2%
Angry 2.9%
Surprised 2.3%
Happy 1.9%
Fear 1.2%

AWS Rekognition

Age 30-40
Gender Male, 99.9%
Calm 89.9%
Happy 7.7%
Confused 0.6%
Surprised 0.6%
Disgusted 0.5%
Sad 0.4%
Angry 0.3%
Fear 0.1%

AWS Rekognition

Age 24-34
Gender Male, 98.3%
Calm 95.9%
Happy 2.2%
Surprised 0.7%
Sad 0.6%
Fear 0.2%
Confused 0.2%
Angry 0.2%
Disgusted 0.1%
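Each Rekognition face record above pairs an age range and a gender estimate with a confidence score per emotion; the reported dominant emotion is simply the arg-max of those scores. A small sketch using the fourth face's values (the helper name `dominant_emotion` is illustrative):

```python
# Emotion confidences for the fourth face above (AWS Rekognition).
emotions = {
    "Calm": 95.9, "Happy": 2.2, "Surprised": 0.7, "Sad": 0.6,
    "Fear": 0.2, "Confused": 0.2, "Angry": 0.2, "Disgusted": 0.1,
}

def dominant_emotion(scores):
    """Return the (emotion, confidence) pair with the highest confidence."""
    return max(scores.items(), key=lambda kv: kv[1])

print(dominant_emotion(emotions))  # -> ('Calm', 95.9)
```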

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.8%
Wheel 99.5%
Bicycle 99.5%
Boat 73.1%

Captions

Microsoft

diagram 42.2%

Text analysis

Amazon

PEOP
LIQUORS
Stores
QUEEN
REMINGTON
FILTER
FILTER QUEEN )
MJI7
ocono LIQUORS
)
MJI7 YTARASY
NAMAJOD
YTARASY
St. Louis!
UNEEM
ocono

Google

REMINGTOn
St.B
MJ17
PEOP
Stores
MJ17 YTARA2 REMINGTOn PEOP IQUORS Stores St.B FILTER QUEEN
FILTER
QUEEN
YTARA2
IQUORS