Human Generated Data

Title

Untitled (group of people in an office in front of a painted portrait of a woman)

Date

1959, printed later

People

Artist: Lester Cole, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.502

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Person 98.9
Human 98.9
Person 98.5
Person 97.8
Furniture 97.5
Indoors 95.9
Library 94.8
Room 94.8
Book 94.8
Shelf 94.3
Person 92.3
Interior Design 92.1
Person 90.5
Bookcase 89.1
Tie 87.3
Accessories 87.3
Accessory 87.3
Table 80.7
Tie 67.9
Person 65.9
Desk 60.9
Living Room 58.2

Clarifai
created on 2023-10-15

people 99.9
adult 99.3
furniture 99
group 98.9
woman 97.6
man 96.9
room 95.9
education 95.3
sit 95.1
book series 92.3
leader 92.1
group together 91.8
newspaper 90.4
several 89.6
library 89.4
bookcase 89.3
writer 88.4
two 88
three 87.8
many 87.3

Imagga
created on 2021-12-14

furniture 30.4
bookcase 26.6
case 24.4
interior 23
computer 22.9
business 20.6
equipment 19.9
furnishing 18.2
technology 17.8
indoors 17.6
television 17.4
table 17.2
call 17
room 16.9
office 16.8
shop 16.1
telephone 14.9
modern 14
electronic equipment 13.9
people 13.4
pay-phone 12.7
desk 12.2
building 12.1
window 12.1
black 12
work 11.8
man 11.4
monitor 11.4
digital 11.3
person 11.3
inside 11
device 10.9
indoor 10.9
home 10.4
architecture 10.1
3d 10.1
old 9.7
working 9.7
design 9.6
laptop 9.5
professional 9.3
communication 9.2
house 9.2
adult 9.1
machine 9
chair 8.9
apartment 8.6
corporate 8.6
glass 8.5
industry 8.5
telecommunication system 8.5
restaurant 8.5
keyboard 8.4
mercantile establishment 8.4
hand 8.3
phone 8.3
businesswoman 8.2
style 8.2
light 8
worker 8
lifestyle 7.9
lamp 7.9
wood 7.5
city 7.5
vintage 7.4
holding 7.4
retro 7.4
back 7.3
data 7.3
decoration 7.2
art 7.2
night 7.1
male 7.1
information 7.1
job 7.1
businessman 7.1
screen 7

Google
created on 2021-12-14

(no tags returned)

Microsoft
created on 2021-12-14

text 100
clothing 95.2
person 86.1
poster 56.4
cartoon 55.3
book 54.3

Face analysis

AWS Rekognition

Age 22-34
Gender Male, 94%
Calm 76.9%
Sad 13.1%
Angry 6.8%
Fear 1.3%
Confused 0.7%
Surprised 0.4%
Disgusted 0.4%
Happy 0.3%

AWS Rekognition

Age 29-45
Gender Male, 99.7%
Fear 89.2%
Surprised 7.5%
Calm 1.2%
Sad 0.9%
Angry 0.4%
Confused 0.4%
Happy 0.2%
Disgusted 0.2%

AWS Rekognition

Age 21-33
Gender Male, 98.8%
Calm 97.3%
Happy 1%
Sad 0.8%
Angry 0.6%
Surprised 0.1%
Confused 0.1%
Disgusted 0.1%
Fear 0%

AWS Rekognition

Age 32-48
Gender Male, 99.9%
Calm 82.2%
Happy 7.6%
Sad 5.1%
Angry 3%
Surprised 1%
Confused 0.7%
Fear 0.2%
Disgusted 0.2%

AWS Rekognition

Age 22-34
Gender Female, 98.3%
Calm 75.3%
Sad 16.3%
Angry 2.4%
Disgusted 1.6%
Surprised 1.4%
Confused 1.2%
Fear 0.9%
Happy 0.9%

AWS Rekognition

Age 20-32
Gender Male, 76.6%
Sad 66.4%
Calm 15.9%
Happy 10.2%
Angry 6.2%
Surprised 0.6%
Disgusted 0.3%
Fear 0.2%
Confused 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.9%
Tie 87.3%

Categories

Imagga

interior objects 99.8%

Text analysis

Amazon

ALGRECO