Question content (please give the correct answer)
[Subjective question]

Which one is wrong about Equal-width (distance) partitioning and Equal-depth (frequency) partitioning?

A、Equal-width (distance) partitioning is the most straightforward, but outliers may dominate presentation.

B、Equal-depth (frequency) partitioning divides the range into N intervals, each containing approximately the same number of samples.

C、The intervals of the former are not equal.

D、The number of tuples in each interval is the same when using the latter.
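The two schemes in the options can be contrasted directly in code. A minimal sketch (function names and the toy data are illustrative, not from the question):

```python
# Equal-width: intervals of equal width, counts per bin may vary (outliers
# can dominate). Equal-depth: bins hold roughly the same number of samples.

def equal_width_bins(values, n):
    """Split the value range into n intervals of equal width."""
    lo, hi = min(values), max(values)
    width = (hi - lo) / n
    bins = [[] for _ in range(n)]
    for v in values:
        # Clamp the maximum value into the last bin.
        idx = min(int((v - lo) / width), n - 1)
        bins[idx].append(v)
    return bins

def equal_depth_bins(values, n):
    """Split the sorted values into n bins of (roughly) equal count."""
    s = sorted(values)
    size = len(s) // n
    return [s[i * size:(i + 1) * size] if i < n - 1 else s[i * size:]
            for i in range(n)]

data = [4, 8, 15, 21, 21, 24, 25, 28, 34]
print([len(b) for b in equal_width_bins(data, 3)])  # counts differ per bin
print([len(b) for b in equal_depth_bins(data, 3)])  # counts roughly equal
```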

Asked by: user longman6202 · Posted: 2022-01-07
More questions related to "Which one is wrong about Equal…"
Question 1
The purpose of correlation analysis is to identify redundant data.
Question 2
How to choose the optimal value for K?

A、Cross-validation can be used to determine a good value by using an independent dataset to validate the K values.

B、Low values for K (like k=1 or k=2) can be noisy and subject to the effect of outliers.

C、A large k value can reduce the overall noise so the value for 'k' can be as big as possible.

D、Historically, the optimal K for most datasets has been between 3-10.
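Option A can be sketched concretely: pick k by validating each candidate on held-out data. Below is a toy leave-one-out cross-validation over a tiny KNN (the dataset and all names are made up for illustration):

```python
# Leave-one-out CV: for each candidate k, predict every point from the
# rest of the data and measure accuracy; keep the best-scoring k.
from collections import Counter
import math

def knn_predict(train, query, k):
    """Majority vote among the k nearest training points (Euclidean)."""
    neighbors = sorted(train, key=lambda p: math.dist(p[0], query))[:k]
    return Counter(label for _, label in neighbors).most_common(1)[0][0]

def loo_accuracy(data, k):
    """Leave-one-out cross-validation accuracy for a given k."""
    hits = sum(knn_predict(data[:i] + data[i + 1:], x, k) == y
               for i, (x, y) in enumerate(data))
    return hits / len(data)

data = [((1, 1), 'a'), ((1, 2), 'a'), ((2, 1), 'a'),
        ((8, 8), 'b'), ((8, 9), 'b'), ((9, 8), 'b')]
best_k = max([1, 3, 5], key=lambda k: loo_accuracy(data, k))
print(best_k, loo_accuracy(data, best_k))
```

On this toy set k=5 fails outright (each point is outvoted by the opposite cluster), illustrating why "as big as possible" in option C is wrong.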

Question 3
Discretization means dividing the range of a continuous attribute into intervals.
Question 4
What’s the difference between an eager learner and a lazy learner?

A、Eager learners generate a model for classification, while lazy learners do not.

B、Eager learners classify a tuple based on its similarity to the stored training tuples, while lazy learners do not.

C、Eager learners simply store the data (or do only minor processing), while lazy learners do not.

D、Lazy learners generate a model for classification, while eager learners do not.

Question 5
What are the major components of KNN?

A、How to measure similarity?

B、How to choose "k"?

C、How are class labels assigned?

D、How to decide the distance?
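The components named in the options (a similarity/distance measure, a choice of k, a label-assignment rule) can all be seen in one toy classifier; here label assignment uses a distance-weighted vote rather than a plain majority (names and data are illustrative):

```python
# Three knobs: the distance measure (math.dist, i.e. Euclidean), the
# parameter k, and the label-assignment rule (distance-weighted voting).
from collections import defaultdict
import math

def weighted_knn(train, query, k):
    nearest = sorted(train, key=lambda p: math.dist(p[0], query))[:k]
    votes = defaultdict(float)
    for point, label in nearest:
        d = math.dist(point, query)
        votes[label] += 1.0 / (d + 1e-9)  # closer neighbors count more
    return max(votes, key=votes.get)

train = [((0.0, 0.0), 'x'), ((0.5, 0.0), 'x'), ((5.0, 5.0), 'y')]
print(weighted_knn(train, (1.0, 0.0), k=3))  # → x
```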

Question 6
Normalizing the data can solve the problem of different attributes having different value ranges.
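A minimal sketch of one common normalization, min-max rescaling of each attribute to [0, 1], so an attribute with a large range (e.g. income) does not dominate one with a small range (e.g. age); the data are illustrative:

```python
# Min-max normalization: rescale each column to [0, 1] independently.

def min_max_normalize(rows):
    cols = list(zip(*rows))
    scaled_cols = []
    for col in cols:
        lo, hi = min(col), max(col)
        scaled_cols.append([(v - lo) / (hi - lo) for v in col])
    return [list(r) for r in zip(*scaled_cols)]

rows = [[30, 30000], [45, 60000], [60, 90000]]  # e.g. age vs. income
print(min_max_normalize(rows))  # → [[0.0, 0.0], [0.5, 0.5], [1.0, 1.0]]
```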
Question 7
Which of the following ways can be used to obtain attribute weights for Attribute-Weighted KNN?

A、Prior knowledge / experience.

B、PCA, FA (Factor analysis method).

C、Information gain.

D、Gradient descent, simplex methods and genetic algorithm.
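However the weights are obtained (prior knowledge, information gain, PCA, a search method, etc.), they plug into the distance the same way. A sketch with an illustrative `weights` vector:

```python
# Attribute-weighted Euclidean distance: each squared difference is
# scaled by that attribute's weight before summing.
import math

def weighted_euclidean(a, b, weights):
    return math.sqrt(sum(w * (x - y) ** 2
                         for w, x, y in zip(weights, a, b)))

# Down-weight the second, less informative attribute.
print(weighted_euclidean((1, 10), (2, 20), weights=(1.0, 0.01)))
```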

Question 8
At the classification stage, KNN would store all instances or some typical ones among them.
Question 9
At the learning stage, KNN would find the K closest neighbors and then decide the classification from the labels of the K identified nearest neighbors.
Question 10
Using Euclidean distance or Manhattan distance, we can calculate the distance between two instances.
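Both metrics, sketched on one pair of instances (the points are illustrative):

```python
# Euclidean: straight-line distance; Manhattan: sum of absolute
# per-coordinate differences (city-block distance).
import math

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def manhattan(a, b):
    return sum(abs(x - y) for x, y in zip(a, b))

p, q = (1, 2), (4, 6)
print(euclidean(p, q))  # → 5.0 (a 3-4-5 right triangle)
print(manhattan(p, q))  # → 7
```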