Human Generated Data

Title

Untitled (men looking into store window displaying vacuum contraption)

Date

1948

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.14423

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Person 99.8
Human 99.8
Person 99.5
Person 98.8
Person 97.7
Clothing 97.1
Apparel 97.1
Female 67.3
Appliance 66.9
Tie 66.6
Accessories 66.6
Accessory 66.6
Shorts 63.5
Steamer 55.5

Clarifai
created on 2023-10-27

people 99.8
adult 99.3
group together 98.5
man 97.8
monochrome 95.7
wear 95.1
group 94.9
vehicle 94.4
two 94.1
uniform 92.6
three 92
administration 90.4
several 88.8
outfit 88.3
four 85.3
woman 84.9
watercraft 81.7
medical practitioner 80.8
lid 80.4
facial expression 79.4

Imagga
created on 2022-01-29

medical 28.2
man 26.2
male 25.5
person 23.5
nurse 22.4
people 21.2
professional 20.9
brass 20.2
patient 20
doctor 19.7
medicine 19.4
hospital 18.2
room 17.7
work 17.3
worker 17.2
home 16.7
health 16.7
wind instrument 15.7
device 15.4
working 15
men 14.6
adult 14.6
indoors 14
occupation 13.7
laboratory 13.5
business 13.4
stethoscope 13.3
research 12.4
instrument 12.1
office 12
kitchen 11.8
scientist 11.7
lab 11.7
chemistry 11.6
science 11.6
holding 11.5
interior 11.5
lab coat 11.3
technology 11.1
musical instrument 11
businessman 10.6
profession 10.5
coat 10.5
student 10
modern 9.8
beaker 9.7
job 9.7
clinic 9.7
chemical 9.7
exam 9.6
illness 9.5
smiling 9.4
inside 9.2
team 9
computer 8.8
looking 8.8
lifestyle 8.7
test 8.7
education 8.7
table 8.6
biology 8.5
house 8.4
hand 8.3
care 8.2
waiter 8.2
equipment 7.8
experiment 7.8
assistant 7.8
cornet 7.6
employee 7.6
happy 7.5
clean 7.5
human 7.5
study 7.5
glasses 7.4
indoor 7.3
information 7.1
to 7.1

Google
created on 2022-01-29

Microsoft
created on 2022-01-29

text 99.5
outdoor 93.3
clothing 82.6
person 76.1
man 72.1
posing 63.3
musical instrument 63

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 51-59
Gender Male, 99%
Calm 92.2%
Surprised 3.8%
Happy 1.1%
Sad 1%
Angry 0.6%
Confused 0.5%
Disgusted 0.4%
Fear 0.4%

AWS Rekognition

Age 28-38
Gender Male, 99.4%
Calm 99.3%
Sad 0.3%
Surprised 0.2%
Angry 0.1%
Disgusted 0.1%
Happy 0%
Confused 0%
Fear 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Tie
Person 99.8%
Person 99.5%
Person 98.8%
Person 97.7%
Tie 66.6%

Categories

Text analysis

Amazon

FILTER
REMINGTON
MJI7
FILTER QUIL
QUIL
rug
rug ARTIR
ARTIR

Google

MJI7 YT3RA2 REMINGTON FILTER QUI
MJI7
YT3RA2
REMINGTON
FILTER
QUI