Human Generated Data

Title

Untitled (probably Alabama)

Date

1930s

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.3116

Machine Generated Data

Tags

Amazon
created on 2021-12-15

Workshop 99
Human 95.9
Person 95.9
Interior Design 95.3
Indoors 95.3
Person 94.8
Person 90.3
Building 88.7
Factory 83.9
Room 83.5
Furniture 79.8
Lab 75.5
Wood 68.2
Table 67.8
Restaurant 66.5
Person 63
Housing 58.2
Classroom 58.1
School 58.1
Assembly Line 57.4
Plywood 55.1
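
The label/score pairs above are confidence values (0-100) of the kind returned by Amazon Rekognition's label detection. A minimal Python sketch of how such tags could be produced is below; boto3 and detect_labels are the real Rekognition API, but the file name and confidence threshold are illustrative assumptions, not part of this record.

    # Sketch: scene/object labels via Amazon Rekognition (boto3).
    import boto3

    rekognition = boto3.client("rekognition")

    # Hypothetical local copy of the photograph.
    with open("shahn_untitled_alabama.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MinConfidence=55,  # assumed cutoff; the lowest score listed above is 55.1
    )

    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')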

Clarifai
created on 2023-10-15

people 99.9
furniture 99
school 98.7
adult 98.4
desk 98
education 98
room 97.3
teacher 96.4
group 95.6
classroom 94.6
chair 94.6
seat 94.1
child 89.5
monochrome 89.1
sit 87.7
group together 87.6
woman 86.6
employee 86.5
one 82.9
man 81.5

Imagga
created on 2021-12-15

room 35.4
interior 28.3
chair 26.9
building 26.1
architecture 25.9
classroom 23.6
modern 23.1
industry 20.5
structure 19.7
house 18.4
industrial 17.2
steel 17.2
urban 16.6
floor 15.8
city 15.8
inside 15.6
table 15.3
balcony 15.2
power 15.1
window 14.8
factory 14.2
furniture 14.1
device 13.8
equipment 13.5
glass 13.2
indoors 13.2
plant 13.1
station 12.8
business 12.8
wood 12.5
hall 12.1
metal 12.1
light 12
seat 11.8
sky 11.5
wall 11.2
office 11.2
door 11.1
shop 10.6
engineering 10.5
home 10.4
chairs 9.8
decor 9.7
machine 9.7
technology 9.6
design 9.6
empty 9.4
construction 9.4
indoor 9.1
water 8.7
apartment 8.6
roof 8.6
dining 8.6
electricity 8.5
energy 8.4
elegance 8.4
people 8.4
work 8
decoration 8
tables 7.9
3d 7.7
fuel 7.7
gas 7.7
pollution 7.7
environment 7.4
center 7.3
travel 7
life 7
mercantile establishment 7

Google
created on 2021-12-15

Microsoft
created on 2021-12-15

text 98.2
furniture 97.8
indoor 97.4
table 96.7
person 89.8
chair 88.1
black and white 83.1
library 79.5
black 71.2
desk 69.1
clothing 53.7
book 52.9

Color Analysis

Face analysis

AWS Rekognition

Age 49-67
Gender Male, 88.4%
Happy 50.1%
Calm 29%
Sad 18.4%
Angry 1.2%
Surprised 0.4%
Confused 0.4%
Fear 0.3%
Disgusted 0.2%

AWS Rekognition

Age 18-30
Gender Female, 68.8%
Calm 88.4%
Happy 10%
Sad 1.4%
Angry 0.1%
Confused 0%
Disgusted 0%
Surprised 0%
Fear 0%
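
The two AWS Rekognition entries above (estimated age range, gender, and emotion scores) correspond to the service's face-detail output. A hedged sketch with boto3 follows; the image file is an assumption, and Attributes=["ALL"] is needed to get the age, gender, and emotion fields.

    # Sketch: per-face age, gender, and emotion estimates via AWS Rekognition.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("shahn_untitled_alabama.jpg", "rb") as f:  # hypothetical local file
        image_bytes = f.read()

    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # default attributes omit age, gender, and emotions
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')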

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
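
Unlike Rekognition's percentages, the Google Vision rows above are likelihood buckets (VERY_UNLIKELY through VERY_LIKELY). A minimal sketch with the google-cloud-vision client is below; the file path is an assumption.

    # Sketch: face-attribute likelihoods via Google Cloud Vision.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("shahn_untitled_alabama.jpg", "rb") as f:  # hypothetical local file
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    for face in response.face_annotations:
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)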

Feature analysis

Amazon

Person 95.9%

Categories

Text analysis

Amazon

HARNESS
4
CAMBRIDGE
CAMBRIDGE 4 HARNESS LION
LION
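
The Amazon text results above mix a full line ("CAMBRIDGE 4 HARNESS LION") with its individual words, which matches Rekognition's DetectText output of both LINE and WORD detections. A sketch, with the image file assumed as before:

    # Sketch: OCR via Amazon Rekognition DetectText.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("shahn_untitled_alabama.jpg", "rb") as f:  # hypothetical local file
        image_bytes = f.read()

    response = rekognition.detect_text(Image={"Bytes": image_bytes})

    for detection in response["TextDetections"]:
        # Returns both LINE and WORD entries, hence the repetition above.
        print(detection["Type"], detection["DetectedText"])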

Google

WANBRDGE 4HARNESS LUN
WANBRDGE
4HARNESS
LUN
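
The Google rows follow the Cloud Vision pattern in which the first text annotation is the full detected string and the rest are individual tokens. A minimal sketch, again with an assumed local file:

    # Sketch: OCR via Google Cloud Vision text detection.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("shahn_untitled_alabama.jpg", "rb") as f:  # hypothetical local file
        image = vision.Image(content=f.read())

    response = client.text_detection(image=image)

    # The first annotation is the whole detected string; later ones are tokens.
    for annotation in response.text_annotations:
        print(annotation.description)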