Human Generated Data

Title

Untitled (wedding group portrait)

Date

1913

People

Artist: Bachrach Studios, founded 1868

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.900

Machine Generated Data

Tags (confidence scores in %)

Amazon
created on 2022-01-23

Person 99.4
Human 99.4
Person 98.1
Person 97.9
Person 95.4
Person 95.1
Person 93.7
Painting 92.7
Art 92.7
Person 91.7
Person 88.5
Person 87.4
Person 86
Person 85
Person 84.8
Person 79.2
Person 78.2
Wood 76.6
Clothing 71.4
Apparel 71.4
Person 70.8
Flooring 69.9
People 69.8
Person 41.9
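
Label lists like the Amazon block above are the kind of output produced by Amazon Rekognition's DetectLabels API, which returns a label name plus a confidence score for each detection. A minimal sketch using boto3, assuming AWS credentials are configured in the environment; the S3 bucket and object key are hypothetical placeholders:

import boto3

# Rekognition client; assumes AWS credentials are configured in the environment.
client = boto3.client("rekognition")

# The bucket and object key below are hypothetical placeholders.
response = client.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "wedding-portrait.jpg"}},
    MinConfidence=40,  # the list above includes scores down to roughly 42%
)

# Print each label with its confidence, matching the "Label score" lines above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')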

Clarifai
created on 2023-10-26

people 99.8
group 99.7
woman 98
art 98
child 97.7
print 96
family 96
adult 94.5
wear 94.3
portrait 94
many 93.5
man 93.5
son 93.4
painting 92
room 90.4
furniture 89.7
education 88.9
sepia 88
dress 87.4
music 86.7

Imagga
created on 2022-01-23

web site 35.7
vintage 22.3
money 18.7
old 17.4
postmark 16.7
art 16.7
envelope 16.6
letter 16.5
cash 16.5
stamp 16.4
mail 16.3
currency 16.1
postage 15.7
postal 15.7
banking 15.6
window 15.5
culture 14.5
insulating material 14.5
one 14.2
paper 14.1
circa 13.8
bank 13.4
post 13.3
bill 13.3
unique 13.3
retro 13.1
symbol 12.8
printed 12.8
business 12.7
financial 12.5
antique 12.4
creation 12.3
newspaper 12.2
savings 12.1
ancient 12.1
dollar 12
black 12
closeup 11.4
home 11.2
museum 11
finance 11
global 10.9
building material 10.9
close 10.8
shows 10.8
design 10.8
dollars 10.6
cutting 10.6
office 10.6
communications 10.5
painted 10.5
fine 10.5
product 10.3
house 10
post mail 9.9
zigzag 9.9
masterpiece 9.9
wealth 9.9
fame 9.9
known 9.9
room 9.8
painter 9.8
paintings 9.8
reflection 9.8
delivery 9.7
exchange 9.5
man 9.5
icon 9.5
rich 9.3
aged 9
person 9
representation 9
frame 9
renaissance 8.8
banknotes 8.8
pay 8.6
grunge 8.5
sculpture 8.4
decoration 8.4
church 8.3
drawing 8.3
paint 8.1
history 8
architecture 7.8
people 7.8
bills 7.8
portrait 7.8
us 7.7
card 7.6
screen 7.6
economy 7.4
background 7.3
investment 7.3
picture 7.1
market 7.1
male 7.1

Google
created on 2022-01-23

(no tags recorded)

Microsoft
created on 2022-01-23

gallery 99.6
text 98
room 97.5
scene 95.4
person 95
clothing 90.6
mammal 72.5
posing 64
man 51.5
picture frame 31.7
several 11.8

Color Analysis

(no color values recorded)

Face analysis

AWS Rekognition

Age 21-29
Gender Male, 85.5%
Calm 62.9%
Confused 12.5%
Angry 11.6%
Sad 6.4%
Surprised 2.9%
Fear 1.5%
Disgusted 1.4%
Happy 0.7%

AWS Rekognition

Age 24-34
Gender Male, 99.9%
Calm 96.5%
Confused 1.5%
Sad 0.6%
Angry 0.4%
Happy 0.4%
Surprised 0.3%
Disgusted 0.2%
Fear 0.1%

AWS Rekognition

Age 30-40
Gender Male, 100%
Calm 77.8%
Sad 12.5%
Confused 3.5%
Angry 1.9%
Disgusted 1.5%
Fear 1.2%
Surprised 1.1%
Happy 0.6%

AWS Rekognition

Age 28-38
Gender Male, 99.6%
Sad 52.4%
Calm 35.3%
Fear 4.1%
Happy 1.9%
Surprised 1.9%
Confused 1.8%
Angry 1.4%
Disgusted 1.1%

AWS Rekognition

Age 6-16
Gender Female, 100%
Sad 98.7%
Calm 1.1%
Confused 0.1%
Fear 0.1%
Angry 0%
Surprised 0%
Disgusted 0%
Happy 0%

AWS Rekognition

Age 28-38
Gender Male, 99.9%
Calm 99.7%
Sad 0.2%
Happy 0%
Confused 0%
Angry 0%
Surprised 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 21-29
Gender Male, 97.9%
Sad 81.7%
Calm 16.4%
Angry 0.6%
Confused 0.5%
Surprised 0.2%
Fear 0.2%
Disgusted 0.2%
Happy 0.1%

AWS Rekognition

Age 26-36
Gender Male, 99.9%
Calm 98.8%
Sad 0.3%
Angry 0.2%
Confused 0.2%
Surprised 0.2%
Happy 0.2%
Fear 0.1%
Disgusted 0.1%

AWS Rekognition

Age 30-40
Gender Male, 100%
Calm 97%
Sad 1.1%
Confused 0.8%
Disgusted 0.4%
Angry 0.4%
Surprised 0.2%
Fear 0.1%
Happy 0.1%

AWS Rekognition

Age 38-46
Gender Male, 96.9%
Happy 33.9%
Calm 28.8%
Sad 15.6%
Confused 6%
Disgusted 4.7%
Fear 4.7%
Angry 3.8%
Surprised 2.6%

AWS Rekognition

Age 20-28
Gender Female, 99.6%
Calm 98.7%
Sad 0.5%
Confused 0.2%
Angry 0.2%
Surprised 0.1%
Happy 0.1%
Fear 0.1%
Disgusted 0.1%

AWS Rekognition

Age 24-34
Gender Female, 100%
Calm 99.3%
Angry 0.5%
Confused 0.1%
Sad 0.1%
Surprised 0%
Happy 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 22-30
Gender Female, 95.4%
Calm 99.7%
Surprised 0.1%
Confused 0.1%
Happy 0%
Sad 0%
Fear 0%
Angry 0%
Disgusted 0%

AWS Rekognition

Age 26-36
Gender Male, 97.2%
Calm 98.5%
Sad 0.5%
Angry 0.4%
Surprised 0.2%
Confused 0.1%
Fear 0.1%
Happy 0.1%
Disgusted 0%

AWS Rekognition

Age 14-22
Gender Female, 91.1%
Sad 73.3%
Calm 24%
Happy 0.9%
Confused 0.6%
Angry 0.4%
Fear 0.3%
Disgusted 0.3%
Surprised 0.2%

AWS Rekognition

Age 34-42
Gender Male, 79.2%
Calm 93.4%
Sad 1.6%
Happy 1.1%
Fear 0.9%
Confused 0.9%
Angry 0.9%
Surprised 0.8%
Disgusted 0.3%
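
Each AWS Rekognition block above pairs an estimated age range and a gender call with a ranked list of emotion confidences. These fields correspond to the FaceDetails structure returned by Rekognition's DetectFaces operation when the full attribute set is requested. A minimal sketch, again with a hypothetical S3 location:

import boto3

client = boto3.client("rekognition")

# Request the full attribute set so age range, gender, and emotions are included.
response = client.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "wedding-portrait.jpg"}},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]    # e.g. {"Low": 21, "High": 29}
    gender = face["Gender"]   # e.g. {"Value": "Male", "Confidence": 85.5}
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions arrive unsorted; sort by confidence to reproduce the lists above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')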

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
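
The Google Vision blocks above report bucketed likelihoods (Very unlikely through Very likely) instead of numeric scores. These buckets map onto the likelihood enum carried by each FaceAnnotation in the Cloud Vision API. A minimal sketch with the google-cloud-vision client; the local file path is a hypothetical placeholder:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Read the image from a local file; the path is hypothetical.
with open("wedding-portrait.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each FaceAnnotation carries bucketed likelihoods rather than percentages.
likelihood_names = ("Unknown", "Very unlikely", "Unlikely",
                    "Possible", "Likely", "Very likely")
for face in response.face_annotations:
    print("Surprise", likelihood_names[face.surprise_likelihood])
    print("Anger", likelihood_names[face.anger_likelihood])
    print("Sorrow", likelihood_names[face.sorrow_likelihood])
    print("Joy", likelihood_names[face.joy_likelihood])
    print("Headwear", likelihood_names[face.headwear_likelihood])
    print("Blurred", likelihood_names[face.blurred_likelihood])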

Feature analysis

Amazon

Person 99.4%
Painting 92.7%

Categories

Imagga

paintings art 99%

Text analysis

Amazon

P42
P42 100%
100%
32
Florence
hade
que
Thomas
Cafe
hereis
#63
have
1.18 #63
BMorton
1913
lew
DGW
N.Putly
Markey
H.hincola
Nurres
Element
Stamber
guitters
1.18
Crass

Google

BHorten N. Putly Hhineola Florena Rafenca P42 190% 32 hadi
N.
Putly
Rafenca
P42
BHorten
Hhineola
Florena
190%
32
hadi
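
OCR listings like the two above come from text-detection endpoints: Rekognition's DetectText returns individual lines and words with confidences, while Cloud Vision's text detection returns a full transcription followed by per-token results. A minimal Rekognition sketch, with the same hypothetical S3 location; handwritten captions on a print like this one commonly yield the low-confidence, garbled tokens seen above:

import boto3

client = boto3.client("rekognition")

# DetectText returns both LINE and WORD detections; bucket/key are hypothetical.
response = client.detect_text(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "wedding-portrait.jpg"}}
)

for detection in response["TextDetections"]:
    print(detection["DetectedText"], f'{detection["Confidence"]:.0f}%')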