Human Generated Data

Title

Untitled (children seated at birthday party)

Date

1948

People

Artist: Orrion Barger, American, active 1913–1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6214

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Furniture 99.9
Person 99
Human 99
Person 98.3
Chair 97.3
Person 97
Indoors 96.1
Interior Design 96.1
Shoe 95.2
Footwear 95.2
Apparel 95.2
Clothing 95.2
Chair 94.7
Person 92.2
Room 92.1
Person 90.9
Chair 89.8
Table 89.1
Person 88.4
Person 84
Person 83.7
Person 79.9
Chair 79.6
Tabletop 71.4
Dining Table 71.3
Porch 70.2
Shoe 65.9
Cafe 60.6
Restaurant 60.6
Classroom 60
School 60
Person 59.5
People 57.2
Crowd 56.4
Person 48.8
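The lists above pair each machine-generated label with a confidence score, with labels repeating once per detected instance. A minimal sketch (hypothetical helper, using a few tags copied from the Amazon list) for parsing such lines into structured pairs:

```python
# Parse "Label confidence" lines into (label, confidence) tuples.
# Labels may contain spaces, so split only on the last space.
def parse_tags(lines):
    tags = []
    for line in lines:
        label, score = line.rsplit(" ", 1)
        tags.append((label, float(score)))
    return tags

sample = [
    "Furniture 99.9",
    "Interior Design 96.1",
    "Dining Table 71.3",
]

tags = parse_tags(sample)
# Keep only labels the service rated at 90 or above.
high_confidence = [label for label, score in tags if score >= 90]
print(high_confidence)  # expect ['Furniture', 'Interior Design']
```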

Imagga
created on 2022-01-22

chair 56.8
seat 31.4
room 30.4
table 29.9
furniture 27.1
interior 25.6
folding chair 23.7
brass 19.6
people 18.4
modern 16.8
floor 16.7
restaurant 15.7
wind instrument 15.6
sitting 15.4
classroom 15.3
house 15
chairs 14.7
business 14.6
indoors 14
empty 13.7
man 13.4
inside 12.9
tables 12.8
male 12.8
wood 12.5
lifestyle 12.3
window 12.2
group 12.1
work 12.1
design 11.8
hall 11.8
glass 11.7
musical instrument 11.6
dining 11.4
urban 11.3
home 11.2
dinner 10.9
person 10.9
relaxation 10.9
computer 10.7
deck 10.4
men 10.3
photographer 10.1
equipment 9.9
cafeteria 9.9
style 9.6
lunch 9.6
architecture 9.4
communication 9.2
office 9.2
team 8.9
trombone 8.8
building 8.8
scene 8.6
contemporary 8.5
relax 8.4
outdoor 8.4
elegance 8.4
drink 8.3
coffee 8.3
device 8.3
silhouette 8.3
outdoors 8.2
structure 8.2
light 8
body 8
working 7.9
gymnasium 7.8
television camera 7.7
summer 7.7
furnishing 7.7
wall 7.7
city 7.5
leisure 7.5
plant 7.5
row 7.4
cornet 7.3
professional 7.3
meal 7.3
tourist 7.2
decor 7.1

Google
created on 2022-01-22

Furniture 93
Chair 90
Black 89.7
Style 83.8
Black-and-white 83.3
Table 80.9
Adaptation 79.4
Monochrome photography 76.1
Monochrome 75.7
Font 75.4
Room 70.3
Art 66.4
Folding chair 65.4
Event 64.8
Stock photography 64.1
History 62.2
Sitting 59.2
Visual arts 57
Toddler 55.2
Child 54.9

Microsoft
created on 2022-01-22

text 99.4
chair 95.6
furniture 86.4
person 69.2
black and white 68.5
table 64.7
clothing 51.5

Face analysis

AWS Rekognition

Age 37-45
Gender Male, 99.8%
Calm 51.8%
Happy 45%
Sad 1.1%
Surprised 0.7%
Disgusted 0.5%
Confused 0.4%
Angry 0.4%
Fear 0.2%

AWS Rekognition

Age 42-50
Gender Male, 99.6%
Calm 98.3%
Happy 1%
Confused 0.2%
Sad 0.2%
Disgusted 0.1%
Surprised 0.1%
Angry 0.1%
Fear 0.1%

AWS Rekognition

Age 43-51
Gender Male, 99.7%
Calm 99.8%
Angry 0.1%
Surprised 0%
Sad 0%
Happy 0%
Confused 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 48-56
Gender Female, 76.5%
Calm 99.9%
Disgusted 0%
Happy 0%
Surprised 0%
Confused 0%
Sad 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 29-39
Gender Male, 87.3%
Calm 90.8%
Surprised 4%
Sad 1.5%
Angry 1.3%
Fear 0.9%
Happy 0.8%
Disgusted 0.4%
Confused 0.3%

AWS Rekognition

Age 24-34
Gender Female, 65.8%
Surprised 85.7%
Calm 7.5%
Happy 2.8%
Angry 1.3%
Sad 1.3%
Fear 0.7%
Confused 0.4%
Disgusted 0.2%

AWS Rekognition

Age 38-46
Gender Male, 94.6%
Calm 83.4%
Sad 14.3%
Disgusted 0.6%
Happy 0.4%
Confused 0.4%
Angry 0.4%
Surprised 0.3%
Fear 0.2%

AWS Rekognition

Age 47-53
Gender Male, 99.7%
Calm 73.7%
Sad 15.2%
Happy 4.5%
Confused 1.9%
Surprised 1.8%
Disgusted 1.4%
Angry 1%
Fear 0.6%

AWS Rekognition

Age 45-53
Gender Female, 75.1%
Calm 69.6%
Happy 15%
Sad 11.7%
Surprised 1.8%
Fear 0.6%
Angry 0.5%
Disgusted 0.4%
Confused 0.3%
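Each AWS Rekognition face record above reports an estimated age range, a gender guess, and an emotion distribution that sums to roughly 100%. A small sketch (using an illustrative subset of the records above) that picks the dominant emotion per face:

```python
# Three of the nine face records, transcribed from the data above.
faces = [
    {"age": (37, 45), "emotions": {"Calm": 51.8, "Happy": 45.0, "Sad": 1.1}},
    {"age": (24, 34), "emotions": {"Surprised": 85.7, "Calm": 7.5, "Happy": 2.8}},
    {"age": (45, 53), "emotions": {"Calm": 69.6, "Happy": 15.0, "Sad": 11.7}},
]

# For each face, take the emotion with the highest score.
dominant = [max(f["emotions"], key=f["emotions"].get) for f in faces]
print(dominant)  # expect ['Calm', 'Surprised', 'Calm']
```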

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
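Unlike Rekognition, Google Vision reports face attributes on an ordinal likelihood scale (the Cloud Vision API's Likelihood enum, Very unlikely through Very likely) rather than numeric confidences. Mapping the labels to ranks makes the records filterable; a sketch using one of the identical face records above:

```python
# Ordinal ranks for the Cloud Vision likelihood labels.
LIKELIHOOD = {
    "Very unlikely": 0,
    "Unlikely": 1,
    "Possible": 2,
    "Likely": 3,
    "Very likely": 4,
}

face = {
    "Surprise": "Very unlikely",
    "Joy": "Very unlikely",
    "Blurred": "Very unlikely",
}

# Flag any attribute rated at least "Possible".
flagged = [k for k, v in face.items() if LIKELIHOOD[v] >= 2]
print(flagged)  # expect [] -- every attribute here is "Very unlikely"
```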

Feature analysis

Amazon

Person 99%
Chair 97.3%
Shoe 95.2%

Captions

Microsoft

a group of people sitting in chairs 82.2%
a group of people sitting in a chair 76.6%
a group of people sitting in chairs in front of a building 76.5%

Text analysis

Amazon

EES
KODVK-SVEELA