Human Generated Data

Title

Untitled (butcher holding meat and selling meat to customers)

Date

c. 1950

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.14377

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Person 99.4
Human 99.4
Person 99.2
Person 99.2
Person 98.4
Person 98.2
Clothing 95.6
Hat 95.6
Apparel 95.6
Helmet 80.5
Chef 73.5
Accessory 72.3
Accessories 72.3
Sunglasses 72.3
Person 64.3
Hat 63.8
Worker 60.2
Bakery 58.4
Shop 58.4
Cafeteria 55.6
Restaurant 55.6
Sitting 55.2
Hat 54.3

Imagga
created on 2022-01-29

laptop 42.9
computer 37.9
man 36.3
work 28.4
male 28.4
working 27.4
newspaper 27
business 26.7
engineer 25.5
person 25.2
people 25.1
office 21.6
product 20.8
device 20.2
technology 20
job 19.5
adult 19
worker 18.7
businessman 18.5
creation 17.1
smiling 15.2
happy 15
notebook 14.7
building 14.3
professional 13.1
lifestyle 13
occupation 12.8
businesswoman 12.7
casual 12.7
student 12.7
women 12.6
builder 12.6
communication 12.6
using 12.5
men 12
sitting 12
room 11.7
holding 11.5
cheerful 11.4
outdoors 11.2
smile 10.7
couple 10.4
day 10.2
indoor 10
attractive 9.8
table 9.7
indoors 9.7
looking 9.6
wireless 9.5
corporate 9.4
construction 9.4
executive 9.4
finance 9.3
suit 9
handsome 8.9
color 8.9
desk 8.7
education 8.7
studying 8.6
equipment 8.6
solar dish 8.6
monitor 8.5
industry 8.5
face 8.5
one person 8.5
two 8.5
senior 8.4
portrait 8.4
modern 8.4
old 8.4
musical instrument 8.3
school 8.3
screen 8.1
group 8.1
together 7.9
standing 7.8
career 7.6
sit 7.6
horizontal 7.5
keyboard 7.5
interior 7.1
happiness 7
solar array 7

Google
created on 2022-01-29

Microsoft
created on 2022-01-29

text 99.4
person 96.5
clothing 85.5
man 85.1

Face analysis

AWS Rekognition

Age 51-59
Gender Male, 99.9%
Calm 99.5%
Sad 0.1%
Confused 0.1%
Disgusted 0.1%
Surprised 0.1%
Happy 0.1%
Fear 0%
Angry 0%

AWS Rekognition

Age 30-40
Gender Male, 91.4%
Calm 88.5%
Surprised 6.5%
Confused 1.3%
Sad 1.1%
Disgusted 0.9%
Happy 0.9%
Angry 0.6%
Fear 0.3%

AWS Rekognition

Age 36-44
Gender Male, 66.1%
Calm 89.8%
Sad 4%
Surprised 2.9%
Happy 1.4%
Angry 0.6%
Disgusted 0.5%
Confused 0.5%
Fear 0.4%

AWS Rekognition

Age 42-50
Gender Male, 99.7%
Calm 66.8%
Sad 20.9%
Confused 6.7%
Happy 2.9%
Surprised 0.9%
Angry 0.8%
Disgusted 0.7%
Fear 0.3%

AWS Rekognition

Age 40-48
Gender Male, 99.6%
Happy 57.1%
Calm 36.2%
Surprised 2.4%
Disgusted 1.6%
Sad 1%
Confused 0.8%
Angry 0.6%
Fear 0.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%
Hat 95.6%
Helmet 80.5%
Sunglasses 72.3%

Captions

Microsoft

a man standing in front of a window 65.8%
a group of people standing in front of a window 65.7%
an old photo of a man 65.6%

Text analysis

Amazon

MARKET
MARKET MASTER'S
MASTER'S
HARRY
STAIRS)
(UP STAIRS)
(UP
-OFFICE
HARRY C.MEYERS
C.MEYERS
MARKET M
Genera
M
St. Lo

Google

MARKET
(UP
MASTERS
St.
Le
enera
STAIRS)
HARRY
M
MARKET MASTERS -OFFICE (UP STAIRS) HARRY C.MEYERS MARKET M enera St. Le
-OFFICE
C.MEYERS