Human Generated Data

Title

Untitled (old photographs taped to wall)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16791

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 98.2
Human 98.2
Person 97.1
Person 96.3
Person 95.4
Advertisement 92
Collage 91.1
Text 77.5
Face 75.7
Art 73.5
Drawing 65.9
Clothing 60.7
Apparel 60.7
Paper 58.5
Flyer 57
Brochure 57
Poster 54.2
Poster 53.2

Clarifai
created on 2023-10-28

people 99.9
adult 99.6
illustration 99.1
group 97.3
woman 97.2
print 97.1
man 97
one 97
painting 96.2
art 95.6
war 94.1
two 93.9
wear 89.9
portrait 89.4
administration 88
leader 86.9
child 85.5
text 84.6
religion 84.5
many 83.1

Imagga
created on 2022-02-26

envelope 21.1
paper 19.7
old 19.5
vintage 19
wall 17.1
stamp 16.9
book 16.4
covering 15
texture 14.6
black 14.4
book jacket 14.4
bookmark 14.1
office 13.4
binding 13.1
blackboard 12.8
frame 12.5
symbol 12.1
art 11.8
board 11.6
jacket 11.2
blank 11.1
container 10.8
retro 10.6
binder 10.6
letter 10.1
business 9.7
protective covering 9.6
wrapping 9.5
empty 9.4
box 9.1
one 9
object 8.8
home 8.8
culture 8.5
grunge 8.5
decoration 8.5
design 8.4
device 8.2
message 8.2
global 8.2
detail 8
icon 7.9
printed 7.9
postage 7.9
chalkboard 7.8
paintings 7.8
antique 7.8
mail 7.7
money 7.7
painted 7.6
post 7.6
sign 7.5
die 7.5
note 7.3
product 7.3
square 7.2
architecture 7.2
museum 7.1
financial 7.1
interior 7.1
painter 7

Microsoft
created on 2022-02-26

drawing 96.7
cartoon 96.1
text 91.9
person 82.6
poster 79.6
clothing 79.3
gallery 74.1
sketch 62.1
painting 56.7
book 55
handwriting 51.9
old 48.8
stone 4

Color Analysis

Face analysis

AWS Rekognition

Age 26-36
Gender Male, 91.6%
Calm 69.7%
Sad 12.9%
Happy 9.2%
Surprised 3.2%
Confused 2.6%
Angry 1.4%
Disgusted 0.6%
Fear 0.4%

AWS Rekognition

Age 30-40
Gender Female, 99.9%
Calm 80.9%
Happy 14.5%
Surprised 1.8%
Sad 1.2%
Fear 0.9%
Disgusted 0.3%
Angry 0.3%
Confused 0.2%

AWS Rekognition

Age 25-35
Gender Male, 99.8%
Calm 58.4%
Sad 32.9%
Surprised 4.4%
Confused 2%
Fear 1.7%
Angry 0.3%
Disgusted 0.2%
Happy 0.2%

AWS Rekognition

Age 23-31
Gender Female, 84.5%
Calm 82.1%
Sad 6.7%
Happy 3.8%
Disgusted 2.6%
Fear 2.4%
Surprised 1%
Confused 0.7%
Angry 0.5%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Poster
Person 98.2%
Person 97.1%
Person 96.3%
Person 95.4%
Poster 54.2%
Poster 53.2%

Captions

Microsoft
created on 2022-02-26

an old photo of a person 52.1%
an old photo of a person 52%
old photo of a person 48.5%

Text analysis

Amazon

1918
1923
aug. 1923
aug.
HATTY
KBOOK

Google

1018 aug. 1925 MJI7--YT3RA°2
1018
aug.
1925
MJI7--YT3RA°2