Human Generated Data

Title

Charity, Children: United States. New York. Pleasantville. Hebrew Sheltering Guardian Society: Hebrew Sheltering Guardian Society Orphan Asylum, Pleasantville, New York: Playing games at Fellowship House.

Date

c. 1900

People

Artist: Apeda Studio, American, active 1910s-1920s

Classification

Photographs

Machine Generated Data

Tags

Amazon

Furniture 99.9
Room 99.9
Indoors 99.9
Table 99.6
Person 98.6
Human 98.6
Person 98.4
Pool Table 98.4
Billiard Room 98.4
Person 98.1
Person 98.1
Person 97.5
Person 97
Person 96.9
Person 96
Chair 94.8
Person 94.5
Person 93
Person 92
Person 69.7

Clarifai

people 99.9
group together 99.7
many 99.6
room 99.1
leader 99.1
furniture 98.9
administration 98.7
group 98.1
chair 98.1
adult 97.9
home 97.5
man 94.4
military 93.9
war 91.7
indoors 90
several 89.6
woman 87.7
desk 85.6
recreation 84.4
wear 82.5

Imagga

room 88.7
interior 79.7
furniture 58.5
house 53.5
table 51.3
home 50.3
chair 42.7
counter 40.4
floor 38.2
decor 38
classroom 37.2
wood 35.1
modern 34.4
design 33.8
kitchen 32
lamp 31.6
luxury 30.9
window 29.7
architecture 27.4
apartment 26.8
indoors 24.6
anteroom 24.4
sofa 24
residential 24
inside 23
living 22.8
light 22.7
hall 21.9
dining 20.9
wall 20.4
indoor 20.1
comfortable 19.1
chairs 18.6
style 18.6
contemporary 17.9
domestic 17.2
glass 17.1
decoration 16.7
lighting 16.4
3d 16.3
residence 15.6
carpet 15.6
elegant 15.4
hotel 15.3
decorate 15.2
estate 15.2
chandelier 15
stove 14.8
couch 14.5
restaurant 14.1
hardwood 13.8
mansion 13.7
expensive 13.4
wooden 13.2
upscale 12.8
seat 12.7
plant 12.7
cabinet 12
cabinets 11.9
relax 11.8
bed 11.4
new 11.3
render 11.3
empty 11.2
cook 11
elegance 10.9
drawer 10.9
space 10.9
stylish 10.9
rug 10.9
living room 10.8
tile 10.6
real 10.4
food 10.3
building 10.2
nobody 10.1
dinner 10.1
dining room 9.9
refrigerator 9.9
tables 9.9
marble 9.8
oven 9.8
panel 9.7
lights 9.3
fireplace 9.2
relaxation 9.2
office 9.1
furnishings 8.9
steel 8.8
dwelling 8.8
bedroom 8.7
vase 8.7
classic 8.4
clean 8.4
fashion 8.3
stool 7.9
suburban 7.9
sink 7.8
real estate 7.8
comfort 7.7
structure 7.6
leather 7.6
area 7.5
warm 7.4
island 7.3
shelf 7.2
center 7
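Each tagging service above returns a flat list of (label, confidence) pairs on different scales and vocabularies. A minimal sketch of comparing them, assuming the lists have already been parsed into dicts (a small hand-copied sample from the lists above, not the services' actual API response formats):

```python
# Small samples of the three tag lists above, as label -> confidence dicts.
# These are hand-copied excerpts for illustration, not full API responses.
amazon = {"furniture": 99.9, "room": 99.9, "indoors": 99.9,
          "table": 99.6, "person": 98.6, "chair": 94.8}
clarifai = {"people": 99.9, "room": 99.1, "furniture": 98.9,
            "chair": 98.1, "home": 97.5, "indoors": 90.0}
imagga = {"room": 88.7, "interior": 79.7, "furniture": 58.5,
          "table": 51.3, "chair": 42.7, "indoors": 24.6}

def consensus(*tag_sets, threshold=0.0):
    """Labels that every service reports at or above the given confidence."""
    kept = [{label for label, conf in tags.items() if conf >= threshold}
            for tags in tag_sets]
    return set.intersection(*kept)

print(sorted(consensus(amazon, clarifai, imagga)))
# ['chair', 'furniture', 'indoors', 'room']
```

All three services agree on the interior-scene labels (room, furniture, chair, indoors), while the lower-confidence Imagga tags (kitchen, hotel, 3d, render) are where the services diverge.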

Face analysis

Amazon

AWS Rekognition

Age 35-52
Gender Male, 54.4%
Angry 45.1%
Happy 54%
Confused 45.1%
Disgusted 45.1%
Sad 45.1%
Surprised 45.1%
Calm 45.5%

AWS Rekognition

Age 26-43
Gender Male, 54.9%
Confused 45.5%
Sad 45.8%
Disgusted 45.2%
Calm 49.6%
Happy 47.7%
Surprised 45.5%
Angry 45.6%

AWS Rekognition

Age 29-45
Gender Male, 55%
Surprised 45.2%
Confused 45.2%
Disgusted 45.2%
Angry 45.1%
Calm 46%
Happy 53.1%
Sad 45.3%

AWS Rekognition

Age 23-38
Gender Male, 54.8%
Disgusted 45.4%
Surprised 45.3%
Confused 45.5%
Sad 47.9%
Happy 46%
Calm 49.4%
Angry 45.5%

AWS Rekognition

Age 20-38
Gender Male, 53.6%
Happy 45%
Angry 45.1%
Calm 45.1%
Sad 54.6%
Confused 45.1%
Disgusted 45.1%
Surprised 45%

AWS Rekognition

Age 45-66
Gender Female, 51.6%
Sad 46.3%
Calm 45.6%
Angry 46.8%
Happy 45.4%
Disgusted 50.1%
Confused 45.5%
Surprised 45.3%

AWS Rekognition

Age 30-47
Gender Male, 50.5%
Sad 49.5%
Disgusted 49.5%
Confused 49.5%
Happy 50.4%
Angry 49.5%
Surprised 49.5%
Calm 49.5%

AWS Rekognition

Age 20-38
Gender Male, 53.1%
Sad 46.7%
Surprised 45.4%
Disgusted 46.7%
Calm 47.8%
Angry 46%
Happy 46.9%
Confused 45.5%

AWS Rekognition

Age 20-38
Gender Male, 50%
Happy 49.5%
Sad 49.6%
Surprised 49.5%
Angry 49.6%
Calm 49.6%
Disgusted 50%
Confused 49.5%

AWS Rekognition

Age 35-52
Gender Male, 50.5%
Disgusted 49.5%
Surprised 49.6%
Happy 50.1%
Calm 49.7%
Angry 49.5%
Sad 49.6%
Confused 49.5%

AWS Rekognition

Age 20-38
Gender Female, 50.5%
Surprised 49.6%
Confused 49.5%
Disgusted 49.6%
Angry 49.6%
Calm 49.7%
Happy 49.8%
Sad 49.7%
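Each face block above reports an estimated age range, a gender guess, and percentage scores across seven emotions. The scores cluster near a common baseline, so the predicted emotion is simply the maximum. A minimal sketch, assuming the scores for one face have already been parsed into a dict (not Rekognition's actual response shape):

```python
# Emotion scores for the first face listed above, copied by hand.
face = {
    "Angry": 45.1, "Happy": 54.0, "Confused": 45.1, "Disgusted": 45.1,
    "Sad": 45.1, "Surprised": 45.1, "Calm": 45.5,
}

def dominant_emotion(scores):
    """Return the (emotion, confidence) pair with the highest score."""
    return max(scores.items(), key=lambda kv: kv[1])

print(dominant_emotion(face))
# ('Happy', 54.0)
```

Note how weak these predictions are: the winning emotion barely exceeds the roughly 45% floor shared by the others, so the per-face emotion labels should be read as low-confidence guesses.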

Feature analysis

Amazon

Person 98.6%
Chair 94.8%

Captions

Microsoft

a vintage photo of a group of people standing in a room 96.3%
a vintage photo of a group of people in a room 96.2%
a vintage photo of a group of people standing around a table 91.9%

Text analysis

Amazon

HAP
22631-c
Tes:

Google

7.3
NAVY
HAP
32631-c
7.3 NAVY HAP 32631-c